M C Duerson, L J Romrell, C B Stevens. Harrell Professional Development and Assessment Center, University of Florida, P.O. Box 100281, Gainesville, FL 32610-0281, USA. mduerson@dean.med.ufl.edu
Abstract
BACKGROUND: The impetus for administering the 2nd-year Objective Structured Clinical Examination (OSCE) came from the great variability in student performance observed by 3rd-year clerkship directors.
PURPOSE: To document the effects of the OSCE on faculty teaching, student performance, and the curriculum over 9 years of administering the examination to more than 1,000 second-year medical students.
METHOD: A 20-station OSCE was administered to all medical students at the end of their 2nd year. Using predetermined criteria, clinical faculty served as evaluators in each station. A mix of 1st-, 3rd-, and 4th-year medical students was recruited to serve as simulated patients. Faculty evaluators and examinees completed a questionnaire evaluating their experience with the OSCE. Students received a report card of their performance. Small-group leaders of the Introduction to Clinical Medicine course received feedback on their group's performance on each station compared with the class mean. Summative data on class performance were reported to the curriculum committee. The academic status committee received data on students who performed unsatisfactorily.
RESULTS: Faculty and examinee ratings of the OSCE experience were very positive. Over the 9-year period, student performance improved, showing less variability and significantly fewer failed stations.
CONCLUSION: The OSCE has proven to be a technically feasible, authentic evaluation method yielding valuable information for decisions regarding student performance, faculty teaching, and curriculum planning.
Authors: Russell B Pierre; Andrea Wierenga; Michelle Barton; J Michael Branday; Celia D C Christie. Journal: BMC Med Educ. Date: 2004-10-16.