
Setting defensible standards for cardiac auscultation skills in medical students.

Diane B Wayne, John Butter, Elaine R Cohen, William C McGaghie.

Abstract

BACKGROUND: Cardiac auscultation is a critical clinical skill for physicians, but minimum performance standards do not exist.
METHOD: One hundred third-year medical students from three schools completed a case-based computerized examination that assessed their ability to identify 12 major cardiac findings. Cohort performance was reviewed by a panel of expert judges who provided item-based (Angoff method) and group-based (Hofstee method) judgments on two occasions. Judges' ratings were used to calculate a minimum passing standard (MPS) for cardiac auscultation skills. Interrater reliabilities and test-retest reliability (stability) were calculated.
RESULTS: Both methods produced reliable and stable data. Use of the Angoff method yielded a more lenient MPS than the Hofstee method. Two thirds of the students (66%) did not achieve the MPS.
CONCLUSIONS: Use of a defensible standard allows for reliable evaluation of cardiac auscultation skills. Further work is needed to improve medical students' performance of this important clinical skill.
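The item-based (Angoff) standard described above is conventionally computed by having each judge estimate, for every test item, the probability that a borderline examinee answers it correctly; each judge's cut score is the sum of those probabilities, and the panel's minimum passing standard (MPS) is the mean across judges. A minimal sketch of that arithmetic, using entirely hypothetical ratings (the paper's actual panel data are not reproduced here), for the 12 cardiac findings:

```python
# Hedged sketch of a conventional Angoff MPS calculation -- NOT the
# authors' actual data or computation. Ratings below are hypothetical.
from statistics import mean

# ratings[judge][item] = judge's estimate of the probability that a
# borderline student correctly identifies that cardiac finding (12 items).
ratings = [
    [0.6, 0.7, 0.5, 0.8, 0.6, 0.7, 0.5, 0.6, 0.7, 0.8, 0.6, 0.5],  # judge 1
    [0.5, 0.6, 0.6, 0.7, 0.5, 0.8, 0.6, 0.5, 0.6, 0.7, 0.5, 0.6],  # judge 2
    [0.7, 0.8, 0.6, 0.7, 0.6, 0.7, 0.5, 0.7, 0.6, 0.8, 0.7, 0.6],  # judge 3
]

# Each judge's cut score is the sum of their item probabilities;
# the panel MPS is the mean across judges, expressed as a percentage.
judge_cut_scores = [sum(judge) for judge in ratings]
mps_percent = mean(judge_cut_scores) / len(ratings[0]) * 100

print(f"Angoff MPS: {mps_percent:.1f}% of items correct")
```

The group-based Hofstee method instead asks judges for acceptable ranges of cut scores and failure rates and intersects those bounds with the observed score distribution, which is why the two methods can yield different standards, as reported here.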


Year:  2009        PMID: 19907398     DOI: 10.1097/ACM.0b013e3181b38e8c

Source DB:  PubMed          Journal:  Acad Med        ISSN: 1040-2446            Impact factor:   6.893


Related articles (7 in total)

1.  Simulation-guided cardiac auscultation improves medical students' clinical skills: the Pavia pilot experience.

Authors:  Stefano Perlini; Francesco Salinaro; Paola Santalucia; Francesco Musca
Journal:  Intern Emerg Med       Date:  2012-07-06       Impact factor: 3.397

2.  Simulation-based mastery learning improves cardiac auscultation skills in medical students.

Authors:  John Butter; William C McGaghie; Elaine R Cohen; Marsha Kaye; Diane B Wayne
Journal:  J Gen Intern Med       Date:  2010-08       Impact factor: 5.128

3.  A tool to assess student performance in a Clostridium difficile infection simulation scenario.

Authors:  Brenda S Bray; Megan N Willson; Jennifer D Robinson; Gregory T Matsuura; Catrina R Schwartz; Douglas L Weeks
Journal:  Am J Pharm Educ       Date:  2013-09-12       Impact factor: 2.047

4.  Validity and reliability assessment of detailed scoring checklists for use during perioperative emergency simulation training.

Authors:  Matthew D McEvoy; William R Hand; Cory M Furse; Larry C Field; Carlee A Clark; Vivek K Moitra; Paul J Nietert; Michael F O'Connor; Mark E Nunnally
Journal:  Simul Healthc       Date:  2014-10       Impact factor: 1.929

5.  Automated near-real-time clinical performance feedback for anesthesiology residents: one piece of the milestones puzzle.

Authors:  Jesse M Ehrenfeld; Matthew D McEvoy; William R Furman; Dylan Snyder; Warren S Sandberg
Journal:  Anesthesiology       Date:  2014-01       Impact factor: 7.892

6.  Validation of a detailed scoring checklist for use during advanced cardiac life support certification.

Authors:  Matthew D McEvoy; Jeremy C Smalley; Paul J Nietert; Larry C Field; Cory M Furse; John W Blenko; Benjamin G Cobb; Jenna L Walters; Allen Pendarvis; Nishita S Dalal; John J Schaefer
Journal:  Simul Healthc       Date:  2012-08       Impact factor: 1.929

7.  Setting a Minimum Passing Standard for the Uncertainty Communication Checklist Through Patient and Physician Engagement.

Authors:  David H Salzman; Kristin L Rising; Kenzie A Cameron; Rhea E Powell; Dimitri Papanagnou; Amanda Doty; Katherine Piserchia; Lori Latimer; William C McGaghie; Danielle M McCarthy
Journal:  J Grad Med Educ       Date:  2020-02
