William L Scheving1, Joseph M Ebersole2, Michael Froehler3, Donald Moore4, Kiersten Brown-Espaillat5, James Closser6, Wesley H Self7, Michael J Ward8. 1. Vanderbilt University School of Medicine, Nashville, TN, United States of America. Electronic address: william.l.scheving@vanderbilt.edu. 2. Vanderbilt University School of Medicine, Nashville, TN, United States of America. Electronic address: joseph.m.ebersole@vanderbilt.edu. 3. Vanderbilt University Medical Center, Department of Neurology, Nashville, TN, United States of America. Electronic address: m.froehler@vumc.org. 4. Vanderbilt University Medical Center, Office of Health Sciences Education, Nashville, TN, United States of America. Electronic address: don.moore@vumc.org. 5. Vanderbilt University Medical Center, Department of Neurology, United States of America. Electronic address: kiersten.brown@vumc.org. 6. Vanderbilt University Medical Center, Department of Neurology, United States of America. Electronic address: james.b.closser@vumc.org. 7. Vanderbilt University Medical Center, Department of Emergency Medicine, United States of America. Electronic address: wesley.self@vumc.org. 8. Vanderbilt University Medical Center, Department of Emergency Medicine, United States of America. Electronic address: michael.j.ward@vumc.org.
Abstract
INTRODUCTION: Emergency department (ED) providers and clinicians find that feedback on acute stroke patients is rewarding, valuable to professional development, and helpful for practice improvement. However, feedback is rarely provided, particularly for patients with stroke. Here we describe the implementation of an electronic stroke outcome reporting tool for providing feedback to ED providers.

METHODS: We evaluated the implementation of an electronic stroke outcome reporting tool at 3 Nashville hospitals. ED staff and providers voluntarily enrolled to receive de-identified reports of clinical (e.g., survival) and operational (e.g., timeliness) outcomes of patients with acute ischemic stroke and were offered free continuing education (CE) credits for following up on patients. We assessed implementation through a descriptive evaluation of feasibility, use of the system and CE, and perceived usefulness of the reports.

RESULTS: We enrolled 232 ED providers, including 107 (46%) nurses and 57 (25%) attending physicians, and transmitted 55 stroke outcome reports. Reports took 30-60 min to compile and were viewed by a mean of 2.6 (SD 1.5) registered providers; 97.1% found the reports useful and 36.2% reported a likelihood to change practice. Continuing education credits were initiated or claimed by 22 providers.

CONCLUSIONS: An electronic stroke outcome reporting tool was used and liked by ED staff and providers, but the time required to compile the reports is the major challenge to scalability. Future research should address the effectiveness of this reporting tool as a source of provider education and its impact on clinical and operational outcomes.

Published by Elsevier Inc.