
In Case You Missed It: Teaching in the Age of AI Kickoff Panel

September 12, 2023
ND Learning

By Amanda Leary

On August 16, we hosted a pre-semester panel designed to help faculty prepare for a school year challenged by the rise of AI. The panel, co-sponsored by Teaching and Learning Technologies, the Office of Academic Standards, and the University Writing Program, featured demonstrations, policy guidance, and advice from instructors across the university about adjusting to teaching in the Age of AI. In case you missed it, here are our top takeaways from the panel:

#1: AI is an opportunity to reexamine teaching practices.

Generative AI is changing the way students think about their work and the role of that work in their learning. It’s on us as instructors to respond to these changes—will we double down on traditional or outdated methods, or rise to the challenge and adapt our teaching to this new reality? Now is a perfect time to rethink our learning goals, our assignments, and how students are demonstrating their learning. If ChatGPT can do it, should we really ask our students to?

#2: Check your settings.

AI platforms vary in how they handle user privacy, store input, and collect profile information. Some require a log-in; others may let you opt out of storing your history. This is an evolving landscape, so before inputting anything into a tool, familiarize yourself with its privacy settings.

#3: Students are receiving guidance, too.

The Office of Academic Standards has communicated with students about how using generative AI fits into Notre Dame’s undergraduate Honor Code. It’s important to note that there are ways to use AI to supplement—not replace—learning course material, such as creating study guides and flashcards, that do not violate the Honor Code.

#4: AI detectors are unreliable. 

AI detectors are generally unreliable and frequently return “false positives”—particularly for non-native English speakers. The University does not endorse or support any AI detection tool and recommends against their use. If you’re concerned that a student’s work was generated by AI, a more productive option might be a conversation with the student about their writing process, sources, the details of their arguments, etc.

#5: You can still assign writing. 

Despite the pervasiveness of chatbots, writing is still an effective way to assess student learning—and not in a “timed, in-class, handwritten essay” way. If you’re concerned about students completing writing assignments with generative AI, there are a few best practices for designing more effective writing assignments: 1) emphasize process over product; 2) incorporate students’ unique voices and perspectives; 3) focus on developing skills rather than demonstrating mastery of content.

#6: Be transparent about your AI policy.

Whether you’re allowing students to use AI in your course or not, it is important to communicate your expectations for student work. When and how will students be allowed to use AI? Should they cite it? If not at all, what are the consequences for using generative AI to complete assignments? Our “Generative AI Syllabus Policy Language” offers guidance on developing a policy for your course.
