Julia K W Yarahuan1, Huay-Ying Lo2, Lanessa Bass2, Jeff Wright3, Lauren M Hess2. 1. Division of Pediatric Hospital Medicine, Department of Pediatrics, Boston Children's Hospital, Boston, Massachusetts, United States. 2. Section of Pediatric Hospital Medicine, Department of Pediatrics, Baylor College of Medicine/Texas Children's Hospital, Houston, Texas, United States. 3. Information Services, Texas Children's Hospital, Houston, Texas, United States.
Abstract
BACKGROUND AND OBJECTIVES: Pediatric residency programs are required by the Accreditation Council for Graduate Medical Education to provide residents with patient-care and quality metrics so that residents can identify knowledge gaps and prioritize improvement efforts. Trainees are interested in receiving these data, but this need remains largely unmet. Our objectives were to (1) design and implement an automated dashboard providing individualized data to residents and (2) examine the usability and acceptability of the dashboard among pediatric residents.
METHODS: We developed a dashboard containing individualized patient-care data for pediatric residents, emphasizing needs identified by residents and residency leadership. To build the dashboard, we created a connection from a clinical data warehouse to data visualization software. We allocated patients to residents based on note authorship and created individualized reports with masked identities to preserve anonymity. After development, we conducted usability and acceptability testing with 11 resident users using a mixed-methods approach: interviews and anonymous surveys evaluated the technical features of the application, ease of use, and users' attitudes toward the dashboard. Categories and subcategories from the usability interviews were identified using a content analysis approach.
RESULTS: The dashboard provides individualized metrics, including diagnosis exposure counts, procedure counts, efficiency metrics, and quality metrics. In content analysis of the usability testing interviews, the most frequently mentioned use of the dashboard was to aid a resident's self-directed learning. Residents had few concerns about the dashboard overall. Surveyed residents found the dashboard easy to use and expressed intention to use it in the future.
CONCLUSION: Automated dashboards may be a solution to the current challenge of providing trainees with individualized patient-care data. Our usability testing revealed that residents found the dashboard useful and intended to use it to develop self-directed learning plans.
Thieme. All rights reserved.
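The attribution and anonymization steps described in the METHODS can be illustrated with a minimal sketch. This is not the authors' implementation; the field names, the warehouse extract format, and the salted-hash masking scheme are all assumptions introduced here to show one plausible way to attribute patients to residents by note authorship and report masked, individualized counts.

```python
import hashlib
from collections import defaultdict

# Hypothetical warehouse extract: one row per signed note.
# The schema below is illustrative, not the study's actual data model.
notes = [
    {"author": "res_jones", "patient_id": "P1", "diagnosis": "bronchiolitis"},
    {"author": "res_jones", "patient_id": "P2", "diagnosis": "asthma"},
    {"author": "res_patel", "patient_id": "P3", "diagnosis": "bronchiolitis"},
    {"author": "res_patel", "patient_id": "P1", "diagnosis": "bronchiolitis"},
]

def allocate_patients(notes):
    """Attribute each patient to every resident who authored a note on them."""
    panel = defaultdict(set)
    for note in notes:
        panel[note["author"]].add(note["patient_id"])
    return panel

def diagnosis_counts(notes, author):
    """Count diagnosis exposures among one resident's attributed notes."""
    counts = defaultdict(int)
    for note in notes:
        if note["author"] == author:
            counts[note["diagnosis"]] += 1
    return dict(counts)

def mask_identity(author, salt="dashboard"):
    """Replace the resident's username with an opaque, stable identifier."""
    return hashlib.sha256((salt + author).encode()).hexdigest()[:8]

# Build a masked, individualized report keyed by opaque resident IDs.
panel = allocate_patients(notes)
report = {mask_identity(a): diagnosis_counts(notes, a) for a in panel}
```

Hashing with a salt keeps each resident's identifier stable across report refreshes while preserving anonymity in the shared view; a production system would hold the salt (or a lookup table) server-side so residents can be shown only their own row.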