What to Do After the Test
You’ve graded the exam; now what? Did the class perform worse than you expected? How do you know whether you wrote a bad question or your students weren’t prepared? In this workshop, we’ll focus on analyzing exam questions, making appropriate adjustments to student grades, and reflecting on the results (encouraging student metacognition through exam wrappers, and keeping instructor notes for next time).
After successfully completing this workshop, you will be able to:
- Apply basic item analysis to your exams
- Apply appropriate adjustments to exam scores
- Prompt student reflection and improve metacognition with exam wrappers
- Use data and reflections to inform future instruction and assessment
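Basic item analysis typically involves two statistics per question: a difficulty index (the proportion of students who answered correctly) and a discrimination index (how well the item separates stronger from weaker students). The sketch below illustrates both on a made-up 0/1 score matrix; the data, function names, and the upper-half/lower-half discrimination method are illustrative assumptions, not the workshop’s prescribed procedure.

```python
# Minimal item-analysis sketch on assumed data (not from the workshop).
# Each row is a student, each column an exam item: 1 = correct, 0 = incorrect.
scores = [
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [1, 1, 1, 1],
    [0, 0, 0, 1],
    [1, 1, 0, 0],
    [0, 0, 0, 1],
]

def difficulty(scores, item):
    """Proportion of students answering the item correctly (higher = easier)."""
    col = [row[item] for row in scores]
    return sum(col) / len(col)

def discrimination(scores, item):
    """Upper-minus-lower discrimination: the item's success rate in the
    top half of the class (by total score) minus the rate in the bottom half.
    Values near 0 or negative flag items worth reviewing."""
    ranked = sorted(scores, key=sum, reverse=True)
    half = len(ranked) // 2
    p_upper = sum(row[item] for row in ranked[:half]) / half
    p_lower = sum(row[item] for row in ranked[-half:]) / half
    return p_upper - p_lower

for item in range(len(scores[0])):
    print(f"item {item}: difficulty={difficulty(scores, item):.2f}, "
          f"discrimination={discrimination(scores, item):+.2f}")
```

A very easy item (difficulty near 1.0) with low discrimination may still be fine as a warm-up; a hard item with near-zero or negative discrimination is the classic signal of a flawed question or key error.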