Writing AI Syllabus Policies: A Practical Guide for Notre Dame Faculty
This practical guide helps Notre Dame instructors craft clear, student-centered syllabus policies for generative AI, offering customizable templates, transparency strategies, and alignment with institutional values.
If students don’t know whether they can use AI tools, they’re left guessing—and that can hold them back. A clear AI policy in your syllabus takes away the guesswork. It tells students what’s okay, what’s not, and how AI can (or can’t) support their learning. Everyone gets the same rules, the same opportunities, and a fair shot at success.
To write such a policy, it helps to think in layers. Expectations work best when they are consistent across levels but tailored to context. At Notre Dame, that means aligning with University guidance, setting a course-level stance, and clarifying details for each assignment.
This guide will focus on course-level AI policies, but it is helpful to begin by articulating their relationship to the other two layers.
Design at Three Levels: University → Course → Assignment
AI policies live at multiple levels. At ND, align your course policy with University Honor Code language, then tailor at the assignment level. This keeps expectations coherent while making room for disciplinary differences.
- University: Link to the Honor Code and any campus AI guidance; use common disclosure language where possible.
- Course: State your general stance and the rationale tied to learning outcomes in your syllabus; note privacy considerations.
- Assignment: Specify what’s allowed for this task (with examples), what must be cited, and the required disclosure format.
Placing all three layers of the AI policy in the syllabus consolidates expectations, rationale, and resources, ensuring every student can find this crucial information in one cohesive document.
We’ll turn now to the course-level AI policy in your syllabus. For assistance at the assignment level, see our guide on Enhancing Assignments with AI Transparency.
Choose Your Course-Level AI Stance: Three-Spectrum Model with Sample Language
Notre Dame uses a Closed/Strategic/Open framework to help faculty articulate their stance on AI use in courses. This model was developed locally and builds on broader teaching scholarship, particularly James Lang’s spectrum of technology use in the classroom. Below are short clauses, aligned with this model, that you can adapt. Specify how particular assignments relate to your course-level policy so expectations remain clear.
1. Closed (Total Ban)
Policy statement: Generative AI tools (e.g., ChatGPT, Gemini, Copilot) may not be used for any part of this course. Using AI to generate ideas, outlines, code, or prose is considered unauthorized assistance under the ND Honor Code. Ask if you are unsure—clarity is part of learning.
Example: In an exam-like take-home proof assignment, prohibit any AI brainstorming or code generation; require students to upload handwritten work plus a brief oral check (five minutes in office hours or Zoom) explaining one step of their reasoning.
2. Strategic (Context-Specific)
Policy statement: You may use generative AI for brainstorming and outlining only. Any text included in your submission must be your original writing. If you consult AI, include a short “AI use” note at the end: list the tool, date, prompt(s), and how you used the output. Directly pasted AI text is not permitted.
Example: In a policy brief, allow students to use AI to generate opposing viewpoints and an outline, but require their final text to be original and supported by peer-reviewed or canonical sources they locate independently.
3. Open (Laissez-Faire)
Policy statement: You are encouraged to use generative AI as a collaborator for idea generation, drafting, and revision. You remain responsible for accuracy, originality, and citation. Include an “AI use” note (tool, date, prompt(s), and contribution). When quoting or closely paraphrasing AI output, mark it as such and provide attribution.
Example: In a software studio, invite students to use AI to refactor code or generate tests, paired with a repository log and a reflective memo on what they accepted or rejected and why.
For additional ready-to-use examples, see this list of dozens of crowdsourced Syllabi Policies for AI Generative Tools.
Accessibility and Privacy
Here are a few additional factors to keep in mind, drawn from the accessibility and general technology components of your syllabus:
- Provide no-cost or institutionally supported options when feasible.
- Accept non-AI pathways for students who opt out.
- Avoid requiring students to upload sensitive or proprietary data to public tools; remind them that public models may retain prompts.
- Offer alternatives for students using assistive technologies; ensure captioning and screen-reader compatibility in AI-enabled workflows.
- Teach verification: Fact-check AI outputs, check citations, and use discipline-specific standards of evidence.
Next Steps
- Review the official Notre Dame University-level policy on generative AI in education: Generative AI Usage Policy for Students.
- Browse the sample syllabi collection to get ideas for how others are approaching AI.
- Craft your own course-level policy, selecting one of the following approaches:
- Closed: No use of AI tools permitted.
- Strategic: Use of AI is allowed only for certain tasks or assignments.
- Open: Use of AI is encouraged with transparency.
- Communicate your policy clearly in your syllabus and on the first day of class.
- Seek support from ND Learning or your colleagues if you’d like help adapting or explaining your policy.
ND Related Resources
Other Related Resources
- Stanford Teaching Commons – Creating Your Course Policy on AI
- UT Austin CTL – Sample AI Syllabus Policy Statements
- Cornell CTI – AI & Academic Integrity (sample course policy language)
- Carnegie Mellon Eberly – Examples of Academic-Integrity Policies Addressing GenAI
- Johns Hopkins University – AI Syllabus Sample Statements
Cite this guide:
Hsu, K., & Ambrose, G. A. (2025). Writing AI syllabus policies: A practical guide for Notre Dame faculty. University of Notre Dame, Center for Teaching and Learning.
Questions or consults? Contact ND Learning for individualized support.