BACKGROUND: Faculty observation of residents and students performing clinical skills is essential for reliable and valid evaluation of trainees. OBJECTIVE: To evaluate the efficacy of a new multifaceted method of faculty development called direct observation of competence training. DESIGN: Controlled trial of faculty from 16 internal medicine residency programs using a cluster randomization design. SETTING: Academic medical centers. PARTICIPANTS: 40 internal medicine teaching faculty members: 17 in the intervention group and 23 in the control group. MEASUREMENTS: Changes in faculty comfort performing direct observation, faculty satisfaction with workshop, and changes in faculty rating behaviors 8 months after completing the training. INTERVENTION: The direct observation of competence workshop combines didactic mini-lectures, interactive small group and videotape evaluation exercises, and evaluation skill practice with standardized residents and patients. RESULTS: 37 faculty members (16 in the intervention group and 21 in the control group) completed the study. Most of the faculty in the intervention group (14 [88%]) reported that they felt significantly more comfortable performing direct observation compared with control group faculty (4 [19%]) (P = 0.04), and all intervention faculty rated the training as outstanding. For 9 videotaped clinical encounters, intervention group faculty were more stringent than controls in their evaluations of medical interviewing, physical examination, and counseling; differences in ratings for medical interviewing and physical examination remained statistically significant even after adjustment for baseline rating behavior. LIMITATIONS: The study involved a limited number of residency programs, and faculty did not rate the performance of actual residents.
CONCLUSION: Direct observation of competence training, a new multifaceted approach to faculty development, leads to meaningful changes in rating behaviors and in faculty comfort with evaluation of clinical skills.
Authors: Michael L Green; Eva M Aagaard; Kelly J Caverzagie; Davoren A Chick; Eric Holmboe; Gregory Kane; Cynthia D Smith; William Iobst Journal: J Grad Med Educ Date: 2009-09
Authors: Payel Roy; Angela H Jackson; Jeffrey Baxter; Belle Brett; Michael Winter; Ilana Hardesty; Daniel P Alford Journal: Pain Med Date: 2019-04-01 Impact factor: 3.750
Authors: Martin G Tolsgaard; Sebastian Bjørck; Maria B Rasmussen; Amandus Gustafsson; Charlotte Ringsted Journal: J Gen Intern Med Date: 2013-08 Impact factor: 5.128