BACKGROUND: Resident evaluation is a complex and challenging task, and little is known about which assessment methods predominate within or across specialties. AIMS: To determine the methods program directors in Canada use to assess residents and their perceptions of how evaluation could be improved. METHODS: We conducted a web-based survey of program directors from Royal College of Physicians and Surgeons of Canada (RCPSC)-accredited training programs to examine the use of the In-Training Evaluation Report (ITER), the use of non-ITER tools and program directors' perceived needs for improvement in evaluation methods. RESULTS: One hundred forty-nine of the 280 eligible program directors participated in the survey. ITERs were used by all but one program. Of the non-ITER tools, oral examinations (85.9%) and multiple choice questions (71.8%) were most utilized, whereas simulations (28.2%) and essays (11.4%) were least used across all specialties. Surgical specialties had significantly higher utilization of multiple choice questions and logbooks, whereas medical specialties were significantly more likely to include Objective Structured Clinical Examinations (OSCEs). Program directors expressed a strong need for national collaboration between programs within a specialty to improve resident evaluation processes. CONCLUSIONS: Program directors use a variety of methods to assess trainees. They continue to rely heavily on the ITER, but are also using other tools.
Authors: Tomce Trajkovski; Christian Veillette; David Backstein; Veronica M R Wadey; Bill Kraemer Journal: Can J Surg Date: 2012-08 Impact factor: 2.089