Adam Tobias1, Robert Sobehart2, Ankur A Doshi1, Brian Suffoletto3. 1. Associate Professor of Emergency Medicine, University of Pittsburgh School of Medicine. 2. Medical Director, Department of Emergency Medicine, Allegheny Health Network. 3. Associate Professor, Department of Emergency Medicine, Stanford University School of Medicine.
Abstract
BACKGROUND: End-of-shift assessments (ESA) can provide representative data on medical trainee performance but do not occur routinely and are not documented systematically. OBJECTIVE: To evaluate the implementation of a web-based tool with text message prompts to assist mobile ESA (mESA) in an emergency medicine (EM) residency program. METHODS: mESA used timed text messages to prompt faculty and trainees to expect in-person qualitative ESA in a milestone content area and to prompt faculty to record descriptive performance data through a web-based platform. We assessed implementation between January 2018 and November 2019 using the RE-AIM framework (reach, effectiveness, adoption, implementation, and maintenance). RESULTS: Reach: 96 faculty and 79 trainees participated in the mESA program. Effectiveness: In surveys, approximately 72% of faculty and 58% of trainees reported increases in providing and receiving ESA feedback after program implementation. In ESA submissions, trainees reported receiving in-person feedback on 90% of shifts. Residency leadership confirmed the perceived utility of the mESA program. Adoption: mESA prompts were sent on 7792 unique shifts across 4 EDs, on all days of the week and at different times of day. Faculty electronically submitted ESA feedback on 45% of shifts. Implementation quality: No technological errors occurred. Maintenance: Completion of in-person ESA feedback and electronic submission of feedback by faculty were stable over time. CONCLUSIONS: We found mixed evidence in support of using a web-based tool with text message prompts for mESA for EM trainees.