CONTEXT: The dissemination of objective structured clinical examinations (OSCEs) is hampered by their high staffing requirements and significantly greater workload compared with multiple-choice examinations. Senior medical students may be able to support faculty staff by assessing their peers. The aim of this study was to assess the reliability of student tutors as OSCE examiners and their acceptance by their peers.
METHODS: Using a checklist and a global rating, teaching doctors (TDs) and student tutors (STs) simultaneously assessed students in basic clinical skills at 4 OSCE stations. Inter-rater agreement between TDs and STs was calculated using kappa values and paired t-tests. Students then completed a questionnaire assessing their acceptance of student peer examiners.
RESULTS: All 214 Year 3 students at the University of Göttingen Medical School were evaluated in spring 2005. Student tutors gave slightly better average grades than TDs (differences of 0.02-0.20 on a 5-point Likert scale). Inter-rater agreement at the stations ranged from 0.41 to 0.64 for checklist assessments and global ratings; overall inter-rater agreement on the final grade was 0.66. Most students felt that assessment by STs would result in the same grades as assessment by TDs (64%) and that it would be similarly objective (69%). Nearly all students (95%) felt confident that they could themselves evaluate their peers in an OSCE.
CONCLUSIONS: On the basis of our results, STs can act as examiners in summative OSCEs assessing basic medical skills. The slightly better grades observed are of no practical concern. Students accepted assessment performed by STs.
Authors: Carlos E Garcia Rodriguez; Raj J Shah; Cody Smith; Christopher J Gay; Jared Alvarado; Douglas Rappaport; William J Adamas-Rappaport; Richard Amini — Journal: J Adv Med Educ Prof — Date: 2019-04
Authors: Julia Freytag; Fabian Stroben; Wolf E Hautz; Stefan K Schauber; Juliane E Kämmer — Journal: Scand J Trauma Resusc Emerg Med — Date: 2019-02-08 — Impact factor: 2.953
Authors: Gunther Weitz; Christian Vinzentius; Christoph Twesten; Hendrik Lehnert; Hendrik Bonnemeier; Inke R König — Journal: GMS Z Med Ausbild — Date: 2014-11-17
Authors: Simon Schwill; Johanna Fahrbach-Veeser; Andreas Moeltner; Christiane Eicher; Sonia Kurczyk; David Pfisterer; Joachim Szecsenyi; Svetla Loukanova — Journal: BMC Med Educ — Date: 2020-01-16 — Impact factor: 2.463