Nicholas D Hartman1, Natasha B Wheaton2, Kelly Williamson3, Erin N Quattromani4, Jeremy B Branzetti5, Amer Z Aldeen6. 1. Department of Emergency Medicine, Wake Forest School of Medicine, Winston-Salem, North Carolina. 2. Department of Emergency Medicine, University of Iowa Carver College of Medicine, Iowa City, Iowa. 3. Department of Emergency Medicine, Advocate Christ Medical Center, Chicago, Illinois. 4. Division of Emergency Medicine, St. Louis University School of Medicine, St. Louis, Missouri. 5. Division of Emergency Medicine, University of Washington School of Medicine, Seattle, Washington. 6. Emergency Medicine Physicians, Ltd, Department of Emergency Medicine, Presence St. Joseph Medical Center, Joliet, Illinois.
Abstract
BACKGROUND: Reading emergent electrocardiograms (ECGs) is one of the emergency physician's most crucial tasks, yet no well-validated tool exists to measure resident competence in this skill. OBJECTIVES: To assess the validity of a novel tool measuring emergency medicine resident competency in interpreting and responding to critical ECGs. In addition, we aimed to observe trends in this skill among resident physicians at different levels of training. METHODS: This was a multi-center, prospective study of postgraduate year (PGY) 1-4 residents at five emergency medicine (EM) residency programs in the United States. An assessment tool was created that asks the physician to identify either the ECG diagnosis or the best immediate management. RESULTS: One hundred thirteen EM residents from five EM residency programs submitted completed assessment surveys, including 43 PGY-1s, 33 PGY-2s, and 37 PGY-3/4s. PGY-3/4s averaged 74.6% correct (95% confidence interval [CI] 70.9-78.4) and performed significantly better than PGY-1s, who averaged 63.2% correct (95% CI 58.0-68.3). PGY-2s averaged 69.0% (95% CI 62.2-73.7). Year-to-year differences were more pronounced in management than in diagnosis. CONCLUSIONS: Residency training in EM seems to be associated with improved ability to interpret "critical" ECGs as measured by our assessment tool. This lends validity evidence for the tool by correlating with a previously observed association between residency training and improved ECG interpretation. Resident skill in ECG interpretation remains less than ideal. Creation of this sort of tool may allow programs to assess resident performance as well as evaluate interventions designed to improve competency.