OBJECTIVE: We sought to develop and validate a three-station simulation-based Objective Structured Clinical Examination (OSCE) tool to assess emergency medicine resident competency in resuscitation scenarios.
METHODS: An expert panel of emergency physicians developed three scenarios for use with high-fidelity mannequins. For each scenario, a corresponding assessment tool was developed with an essential actions (EA) checklist and a global assessment score (GAS). The scenarios were (1) unstable ventricular tachycardia, (2) respiratory failure, and (3) ST elevation myocardial infarction. Emergency medicine residents were videotaped completing the OSCE, and three clinician experts independently evaluated the videotapes using the assessment tool.
RESULTS: Twenty-one residents completed the OSCE (nine residents in the College of Family Physicians of Canada-Emergency Medicine [CCFP-EM] program, six junior residents in the Fellow of the Royal College of Physicians of Canada-Emergency Medicine [FRCP-EM] program, and six senior residents in the FRCP-EM program). Interrater reliability for the EA scores was good but varied between scenarios (Spearman rho = [1] 0.68, [2] 0.81, [3] 0.41). Interrater reliability for the GAS was also good, with less variability (rho = [1] 0.64, [2] 0.56, [3] 0.62). On GAS scores, senior FRCP residents outperformed CCFP-EM residents in all scenarios and junior residents in two of three scenarios (p < 0.001 to 0.01). On EA scores, senior FRCP residents outperformed CCFP-EM residents, but junior residents outperformed senior FRCP residents in scenario 1 and CCFP-EM residents in all scenarios (p = 0.006 to 0.04).
CONCLUSION: This study outlines the creation of a high-fidelity simulation assessment tool for trainees in emergency medicine. A single-point GAS demonstrated stronger relational validity and more consistent reliability than an EA checklist. This preliminary work provides a foundation for future development of simulation-based assessment tools.
Authors: J Damon Dagnone; Andrew K Hall; Stefanie Sebok-Syer; Don Klinger; Karen Woolfrey; Colleen Davison; John Ross; Gordon McNeil; Sean Moore Journal: Can Med Educ J Date: 2016-03-31