Post-exam feedback with question rationales improves re-test performance of medical students on a multiple-choice exam

Advances in Health Sciences Education: Theory and Practice (2018)

Cited by 17 | Views 8
Abstract
This study compared the effects of two types of delayed feedback (correct response only, or correct response plus rationale) provided to students by a computer-based testing system following an exam. The preclinical medical curriculum at the University of Kansas Medical Center uses a two-exam system for summative assessments in which students test, revisit the material, and then re-test (same content, different questions), with the higher score used to determine the students' grades. Using a quasi-experimental design and data collected during the normal course of instruction, test and re-test scores from midterm multiple-choice examinations were compared between academic year (AY) 2015–2016, in which delayed feedback provided the correct answer only, and AY 2016–2017, in which delayed feedback consisted of the correct answer plus a rationale. The average increase in score on the re-test was 2.29 ± 6.83% (n = 192) with the correct answer only and 3.92 ± 7.12% (n = 197) with rationales (p < 0.05). The effect of the rationales did not differ among students of differing academic ability based on entering composite MCAT scores or Year 1 GPA. Thus, delayed feedback with exam question rationales resulted in a greater increase in exam score between the test and re-test than feedback with the correct response only. This finding suggests that delayed elaborative feedback on a summative exam produced a small but significant improvement in learning in medical students.
Keywords
Delayed feedback, Elaborative feedback, Medical student, Multiple-choice question, Computer-based testing, Summative assessment