
OSCE checklists do not capture increasing levels of expertise.

B Hodges, G Regehr, N McNaughton, R Tiberius, M Hanson.

Abstract

PURPOSE: To evaluate the effectiveness of binary content checklists in measuring increasing levels of clinical competence.
METHOD: Fourteen clinical clerks, 14 family practice residents, and 14 family physicians participated in two 15-minute standardized patient interviews. An examiner rated each participant's performance using a binary content checklist and a global process rating. The participants provided a diagnosis two minutes into and at the end of the interview.
RESULTS: On global scales, the experienced clinicians scored significantly better than did the residents and clerks, but on checklists, the experienced clinicians scored significantly worse than did the residents and clerks. Diagnostic accuracy increased for all groups between the two-minute and 15-minute marks without significant differences between the groups.
CONCLUSION: These findings are consistent with the hypothesis that binary checklists may not be valid measures of increasing clinical competence.


Year:  1999        PMID: 10536636     DOI: 10.1097/00001888-199910000-00017

Source DB:  PubMed          Journal:  Acad Med        ISSN: 1040-2446            Impact factor:   6.893


Related articles: 53 in total

1.  Contextualizing SEGUE: Evaluating Residents' Communication Skills Within the Framework of a Structured Medical Interview.

Authors:  Jared Lyon Skillings; John H Porcerelli; Tsveti Markova
Journal:  J Grad Med Educ       Date:  2010-03

2. [Review] Evidence based checklists for objective structured clinical examinations.

Authors:  Christopher Frank
Journal:  BMJ       Date:  2006-09-09

3.  Problems with using a supervisor's report as a form of summative assessment.

Authors:  Tim J Wilkinson; Winnie B Wade
Journal:  Postgrad Med J       Date:  2007-07       Impact factor: 2.401

4.  Educational testing and validity of conclusions in the scholarship of teaching and learning.

Authors:  Michael J Peeters; Svetlana A Beltyukova; Beth A Martin
Journal:  Am J Pharm Educ       Date:  2013-11-12       Impact factor: 2.047

5. [Review] Development and content validation of performance assessments for endoscopic third ventriculostomy.

Authors:  Gerben E Breimer; Faizal A Haji; Eelco W Hoving; James M Drake
Journal:  Childs Nerv Syst       Date:  2015-05-01       Impact factor: 1.475

6. [Review] Simulation for competency assessment in vascular and cardiac ultrasound.

Authors:  Florence H Sheehan; R Eugene Zierler
Journal:  Vasc Med       Date:  2018-02-07       Impact factor: 3.239

7.  A standardized rubric to evaluate student presentations.

Authors:  Michael J Peeters; Eric G Sahloff; Gregory E Stone
Journal:  Am J Pharm Educ       Date:  2010-11-10       Impact factor: 2.047

8.  Getting to the right question.

Authors:  Todd Cassese; Elizabeth Kaplan; Vanja Douglas; Gurpreet Dhaliwal
Journal:  J Gen Intern Med       Date:  2012-11-27       Impact factor: 5.128

9.  The Postoperative Pain Assessment Skills pilot trial.

Authors:  Michael McGillion; Adam Dubrowski; Robyn Stremler; Judy Watt-Watson; Fiona Campbell; Colin McCartney; Charles Victor; Jeffrey Wiseman; Linda Snell; Judy Costello; Anja Robb; Sioban Nelson; Jennifer Stinson; Judith Hunter; Thuan Dao; Sara Promislow; Nancy McNaughton; Scott White; Cindy Shobbrook; Lianne Jeffs; Kianda Mauch; Marit Leegaard; W Scott Beattie; Martin Schreiber; Ivan Silver
Journal:  Pain Res Manag       Date:  2011 Nov-Dec       Impact factor: 3.037

10.  Teaching basic fiberoptic intubation skills in a simulator: initial learning and skills decay.

Authors:  Rana K Latif; Alexander Bautista; Xinyuan Duan; Aurel Neamtu; Dongfeng Wu; Anupama Wadhwa; Ozan Akça
Journal:  J Anesth       Date:  2015-10-22       Impact factor: 2.078
