Jeffrey A Wilhite, Harriet Fisher, Lisa Altshuler, Elisabeth Cannell, Khemraj Hardowar, Kathleen Hanley, Colleen Gillespie, Sondra Zabar.
Abstract
Objective structured clinical examinations (OSCEs) provide a controlled, simulated setting for competency assessments, while unannounced simulated patients (USPs) measure competency in situ or real-world settings. This exploratory study describes differences in primary care residents' skills when caring for the same simulated patient case in OSCEs versus in a USP encounter. Data reported describe a group of residents (n=20) who were assessed following interaction with the same simulated patient case in two distinct settings: an OSCE and a USP visit at our safety-net clinic from 2009 to 2010. In both scenarios, the simulated patient presented as an asthmatic woman with limited understanding of illness management. Residents were rated through a behaviourally anchored checklist on visit completion. Summary scores (mean % well done) were calculated by domain and compared using paired sample t-tests. Residents performed significantly better with USPs on 7 of 10 items and in two of three aggregate assessment domains (p<0.05). OSCE structure may impede assessment of activation and treatment planning skills, which are better assessed in real-world settings. This exploration of outcomes from our two assessments using the same clinical case lays a foundation for future research on variation in situated performance. Using both assessments during residency will provide a more thorough understanding of learner competency. © Author(s) (or their employer(s)) 2021. No commercial re-use. See rights and permissions. Published by BMJ.Entities:
Keywords: assessment; clinical skills practice; contextual; education; graduate medical education; simulation-based medical education
Year: 2020 PMID: 35515723 PMCID: PMC8936516 DOI: 10.1136/bmjstel-2020-000759
Source DB: PubMed Journal: BMJ Simul Technol Enhanc Learn ISSN: 2056-6697