Jonathan P Wanderer1, Thomas A Lasko2, Joseph R Coco2, Leslie C Fowler3, Matthew D McEvoy3, Xiaoke Feng4, Matthew S Shotwell5, Gen Li3, Brian J Gelfand3, Laurie L Novak2, David A Owens6, Daniel V Fabbri7. 1. Department of Anesthesiology, Department of Biomedical Informatics, Vanderbilt University Medical Center, United States. Electronic address: jon.wanderer@vumc.org. 2. Department of Biomedical Informatics, Vanderbilt University Medical Center, United States. 3. Department of Anesthesiology, Vanderbilt University Medical Center, United States. 4. Department of Biostatistics, Vanderbilt University Medical Center, United States. 5. Department of Biostatistics, Department of Anesthesiology, Vanderbilt University Medical Center, United States. 6. Owen Graduate School of Management, Vanderbilt University, United States. 7. Department of Biomedical Informatics, Department of Computer Science, Vanderbilt University Medical Center, United States.
Abstract
STUDY OBJECTIVE: A challenge in reducing unwanted care variation is effectively managing the wide variety of performed surgical procedures. While an organization may perform thousands of types of cases, privacy and logistical constraints prevent review of previous cases to learn about prior practices. To bridge this gap, we developed a system for extracting key data from anesthesia records. Our objective was to determine whether use of the system would improve case planning performance for anesthesia residents. DESIGN: Randomized, cross-over trial. SETTING: Vanderbilt University Medical Center. MEASUREMENTS: We developed a web-based data visualization tool for reviewing de-identified anesthesia records. First-year anesthesia residents were recruited and performed simulated case planning tasks (e.g., selecting an anesthetic type) across six case scenarios using a randomized, cross-over design after a baseline assessment. An algorithm scored case planning performance on a 0-4 point scale, based on whether the care components selected by residents occurred frequently among prior anesthetics. Linear mixed effects regression quantified the tool's effect on the average performance score, adjusting for potential confounders. MAIN RESULTS: We analyzed 516 survey questionnaires from 19 residents. The mean performance score was 2.55 ± 0.32 (SD). Use of the tool was associated with an average score improvement of 0.120 points (95% CI 0.060 to 0.179; p < 0.001). Additionally, a 0.055 point improvement due to a "learning effect" was observed from each assessment to the next (95% CI 0.034 to 0.077; p < 0.001). Assessment score was also significantly associated with specific case scenarios (p < 0.001). CONCLUSIONS: This study demonstrated the feasibility of developing a clinical data visualization system that aggregated key anesthetic information and found that use of the tool modestly improved residents' performance in simulated case planning.
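The abstract describes scoring each simulated case plan on a 0-4 scale according to whether the resident's selected care components occurred frequently among prior anesthetics. The paper does not publish the algorithm, but a minimal sketch of one plausible frequency-based scheme might look like the following; the component names (`anesthetic_type`, `airway`, etc.) and the frequency threshold are illustrative assumptions, not the authors' actual method.

```python
from collections import Counter

def score_case_plan(resident_plan, prior_cases, threshold=0.2):
    """Score a simulated case plan on a 0-4 scale.

    Hypothetical reconstruction: award one point for each care component
    whose selected option appears in at least `threshold` of prior
    anesthetics for a comparable procedure. The four component names and
    the threshold value are assumptions for illustration only.
    """
    score = 0
    for component, choice in resident_plan.items():
        # Tally how often each option was chosen for this component in prior cases.
        counts = Counter(case[component] for case in prior_cases if component in case)
        total = sum(counts.values())
        if total and counts[choice] / total >= threshold:
            score += 1
    return score

# Illustrative prior anesthetic records with four care components,
# matching the 0-4 point range described in the abstract.
priors = [
    {"anesthetic_type": "general", "airway": "ETT", "induction": "propofol", "analgesia": "fentanyl"},
    {"anesthetic_type": "general", "airway": "LMA", "induction": "propofol", "analgesia": "fentanyl"},
    {"anesthetic_type": "regional", "airway": "natural", "induction": "none", "analgesia": "block"},
]
plan = {"anesthetic_type": "general", "airway": "LMA", "induction": "propofol", "analgesia": "fentanyl"}
print(score_case_plan(plan, priors))  # → 4 (each choice clears the 0.2 frequency threshold)
```

Under this sketch, a plan composed entirely of rarely chosen options would score 0, while one assembled from locally common practices scores 4, which matches the abstract's framing of the score as agreement with frequent prior practice.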