Cheryl Bodamer¹, Moshe Feldman, Jeffrey Kushinka, Ellen Brock, Alan Dow, Jessica A Evans, Gonzalo Bearman. ¹From the Center for Human Simulation and Patient Safety (CB, EB, MF), the Center for Interprofessional Education and Collaborative Care (AD), the Department of Internal Medicine (JK, GB), and the Office of Assessment and Evaluation Studies (JE, MF) at Virginia Commonwealth University School of Medicine.
Abstract
INTRODUCTION: Achieving standardized assessment of medical student competency in patient care is a challenge. Simulation may provide unique contributions to overall assessment. We developed an Internal Medicine Standardized Simulation-Based Examination (SSBE) for the third-year clerkship to assess students' medical knowledge, diagnostic skills, and clinical management skills. We assessed convergent and criterion validity by comparing SSBE scores with United States Medical Licensing Examination (USMLE) Step 2 Clinical Knowledge, shelf examination, eQuiz, objective structured clinical examination (OSCE), and ward evaluation scores, as well as overall clerkship grades. We hypothesized that the SSBE would allow a more reliable assessment of these competencies and add value to existing assessments. METHODS: A prospective study design was used. The SSBE consisted of a computer-based photo quiz and cases on high-fidelity simulators. Performance on the SSBE was compared with standardized examinations, clinical evaluations, and overall clerkship grades. Students completed an evaluation of the experience. RESULTS: Two hundred seven students completed the SSBE, with a mean (SD) score of 76.69 (7.78). SSBE performance was positively related to other assessments of medical knowledge, namely eQuiz scores (r(203) = 0.33, P < 0.01) and shelf examination scores (r(158) = 0.53, P < 0.01), and to clinical performance (ward scores; r(163) = 0.31, P < 0.01), but not to OSCE scores. There were also positive relationships with final clerkship grades (r(163) = 0.45, P < 0.01), shelf examination scores (r(158) = 0.52, P < 0.01), and Step 2 Clinical Knowledge scores (r(76) = 0.54, P < 0.01). Most students (93%) agreed that it was a fair examination. CONCLUSIONS: Our results provide validity evidence for the SSBE as an additional assessment tool that uses a novel approach for evaluating competency in patient care at the clerkship level.