School of Medicine Policy on Student Use of Generative AI in UME
Policy
1. Purpose and Philosophy
The purpose of this policy is to outline acceptable student use of generative artificial intelligence (AI) tools within the Undergraduate Medical Education (UME) curriculum. In medical school, the process of completing academic work is as important as the final product.
Writing assignments, clinical reflections and case analyses are designed to clarify thinking, promote reflective practice and foster the critical reasoning skills foundational to the practice of medicine. Therefore, medical students may not utilize generative AI and large language models as a substitute for their own knowledge acquisition, analysis and self-reflection.
2. Academic Coursework and Integrity
This section defines when AI use is permitted in coursework, requires transparency when it is used, and upholds academic integrity and equitable access across learning activities.
Original Authorship: Assignments submitted for credit must be authored directly by students unless the instructor grants specific permission to use generative AI. Using AI to generate content without permission constitutes plagiarism or misrepresentation of the source of academic work.
Course-Specific Policies: Faculty maintain the authority to permit, restrict or prohibit AI use in their specific courses, and must clearly state their expectations in the syllabus and assignment prompts.
Disclosure and Verification: When students are permitted or asked to use certain AI technologies, they must explicitly disclose which tools were used and how they were used. Students remain solely responsible for identifying and correcting any inaccuracies, fabrications or misinformation resulting from the use of AI tools. Failure to disclose AI use represents a violation of the tenets of professionalism and will be addressed per institutional policy.
Equity of Access: If an instructor assigns coursework that requires the use of generative AI, the course director must address limitations in access to premium (paid) AI tools to ensure equitable learning opportunities for all students.
3. Clinical Learning Environment and Patient Care
The use of AI in the clinical environment carries significant legal and patient-safety implications.
Clinical Documentation: Medical students may not create History and Physicals or patient care notes using artificial intelligence applications outside of those officially supported and embedded within the clinical site's Electronic Health Record system (e.g., specific dot phrases or system-generated text).
Protected Health Information (PHI): Under no circumstances may students enter patient data or PHI into unauthorized, public-facing generative AI tools. Doing so is a direct violation of HIPAA and institutional privacy policies.
4. Research and Scholarship
When submitting scholarly work, research abstracts or manuscripts for publication or presentation, medical students must strictly adhere to the generative AI policies set forth by the target journals and professional organizations. Students must transparently disclose when and how these tools have been used in their research process.
5. Violations and Enforcement
Unauthorized use of generative AI in preclinical coursework, clerkship documentation or research is considered a breach of academic integrity and professionalism. Suspected violations will be reported to the [Student Conduct Board / Academic Promotions Committee] and evaluated under the [Enforcement of University Behavioral Standards Policy / Code of Academic Integrity]. As described in clerkship orientations, utilizing unauthorized AI for patient notes may be grounds for failing a clerkship.
Sample Email for Rolling Out AI Policy
Subject: Important Update: New Policy on Student Use of Generative AI in the UME Curriculum
Dear Medical Students, Faculty and Staff,
As generative artificial intelligence (AI) tools become increasingly integrated into the healthcare landscape, [Institution Name] is committed to preparing our future physicians to use these technologies responsibly, safely and ethically. To ensure we maintain the highest standards of academic integrity and patient privacy, we have developed a new Policy on Student Use of Generative AI in Undergraduate Medical Education.
You can review the full policy in the student handbook here: [Link to Policy].
While we encourage the exploration of AI as a supplemental learning tool, it is vital to remember that AI cannot replace the deep knowledge acquisition, critical reasoning and self-reflection required to become a competent physician.
Key Takeaways for Medical Students:
Preclinical Coursework: You may only use generative AI for assignments if your instructor has explicitly granted permission. If permitted, you must properly cite your use of the tool and verify all AI-generated information for accuracy. Unauthorized use is considered an academic integrity violation.
Clinical Clerkships: You are strictly prohibited from using external generative AI applications (like public ChatGPT) to write History & Physicals or patient notes. Notes may only be generated using the approved tools natively embedded within the clinic’s Electronic Health Record system.
Patient Privacy: Protected Health Information must never be entered into unapproved AI platforms.
Guidance for Faculty:
Instructors are asked to clearly communicate their AI expectations in course syllabi and assignment instructions. If you require the use of an AI tool for a specific assignment, please ensure all students have equitable access to the platform (e.g., providing alternatives to premium/paid subscriptions).
As we navigate this new era of medical education together, we will host a session, "Generative AI in UME: Best Practices and Boundaries," on [Date/Time] to answer your questions and discuss approved use cases.
Thank you for your commitment to professionalism, patient safety and educational excellence.
Sincerely,
[Name] [Title, e.g., Dean of Medical Education / Associate Dean for Student Affairs] [Institution Name]
Announcement Email Template
Subject: New Guidance: Faculty and Preceptor Use of AI in Teaching and Assessment
Dear Faculty and Clinical Preceptors,
As generative artificial intelligence (AI) continues to reshape the landscape of higher education and healthcare, we recognize the incredible potential these tools hold to support your teaching, streamline course preparation and enhance the learning environment. To support you in navigating this new technology safely and ethically, the Office of the Provost is releasing our new Policy on Faculty and Preceptor Use of AI in Teaching and Assessment.
You can review the complete guidelines here: [Link to Policy].
Our goal is to empower you to innovate while safeguarding student privacy, academic rigor and the irreplaceable value of human mentorship.
Key Highlights for Educators:
Course Preparation: You are encouraged to use AI as a "co-pilot" to help draft lesson plans, brainstorm active learning exercises or create clinical vignettes. However, you remain fully responsible for fact-checking and ensuring the accuracy and lack of bias in any AI-generated materials you distribute.
Setting Student Expectations: You have the autonomy to dictate how AI may or may not be used by students in your courses. You are required to include a clear, explicit AI usage policy in your course syllabus and assignment prompts.
Protecting Student Data (FERPA): Student assignments, grades and personally identifiable information must never be uploaded into unauthorized, public-facing AI tools for grading or feedback. AI cannot replace your academic judgment in summative grading.
Rethinking AI Detection: Because current AI detection tools are unreliable and prone to false positives (especially for non-native English speakers), they should not be used as the sole evidence for academic misconduct. We encourage faculty to focus on designing authentic, "AI-resistant" assessments that emphasize critical thinking and the learning process.
Support and Resources:
We know that adapting to these changes requires time and support. The Center for Teaching and Learning will be offering a series of workshops this semester focused on redesigning assessments, drafting syllabus policies and exploring approved AI tools.
If you have immediate questions or need assistance crafting an AI syllabus statement tailored to your specific clinical or didactic course, please contact [Email Address/Contact Person].
Thank you for your continued dedication to our students and for your thoughtful approach to integrating these powerful new tools into our educational mission.
Sincerely,
[Name] [Title, e.g., Provost / Chief Academic Officer] [Institution Name]