Han Chang Lim1,2, Jodie A Austin1,2, Anton H van der Vegt3, Amir Kamel Rahimi1,4, Oliver J Canfell1,4,5, Jayden Mifsud1, Jason D Pole1, Michael A Barras6,7, Tobias Hodgson5, Sally Shrapnel1,8, Clair M Sullivan1,9.
Abstract
OBJECTIVE: A learning health care system (LHS) uses routinely collected data to continuously monitor and improve health care outcomes. Little is reported on the challenges and methods used to implement the analytics underpinning an LHS. Our aim was to systematically review the literature for reports of real-time clinical analytics implementation in digital hospitals and to use these findings to synthesize a conceptual framework for LHS implementation.
Year: 2022 PMID: 35388447 PMCID: PMC8986462 DOI: 10.1055/s-0042-1743243
Source DB: PubMed Journal: Appl Clin Inform ISSN: 1869-0327 Impact factor: 2.762
Fig. 1 Three horizons framework for digital health transformation.[6]
- RQ-1. What challenges to clinical dashboard implementation are commonly identified?
- RQ-2. What successful methods have been used by health care organizations to overcome these challenges?
- RQ-3. How have clinical dashboard implementations been assessed, and how effective have they been for health care organizations?
Study characteristics of included articles
| Study (year) and Country | Study design | Participants and sample size | Duration | Target user and intervention/s | Outcome measures (1) and Implementation (2) |
|---|---|---|---|---|---|
| Mlaver et al (2017) | CSS | Tertiary academic hospital | Pre = 16 months | The rounding team: | 1. Health Information Technology Usability Evaluation Scale (Health-ITUES) survey |
| Fletcher et al (2017) | RMDS | Academic medical center | Pre = 2 months | Rapid response team (RRT): provided at-a-glance visibility of timely and accurate critical patient safety indicator information for multiple patients | 1. Incidence ratio of all RRT activations; measured the reduction in unexpected ICU transfers, unexpected cardiopulmonary arrests, and unexpected deaths |
| Cox et al (2017) | RCS | Tertiary academic hospital | Pre = 6 months | Heart failure providers: the heart failure dashboard listed patients with heart failure and described their clinical profiles using a color-coded system | 1. Automatic identification of heart failure admissions and real-time assessment of disease characteristics and medical therapy |
| Franklin et al (2017) | CSS | Training and academic hospitals, community hospitals, private hospitals | Pre = 400 hours (at least 75 hours of observation per facility) | Clinicians, medical directors, ED directors, charge nurses: the dashboard visualizations increased situation awareness and provided a real-time snapshot of the department and individual stages of care | 1. Anecdotal evidence |
| Ye et al (2019) | RPCS | Two Berkshire Health acute care hospitals | Pre = 2 years | Clinicians: the EWS provided real-time alerts/notifications when a patient's condition met predefined prediction thresholds and risk scores | 1. Evaluation of the machine learning algorithms in identifying high-risk patients and alerting staff to patients at high risk of mortality |
| Yoo et al (2018) | CSS | Tertiary teaching hospital | Pre = 5 years | Physicians, nurses: the dashboard visualized the geographical layout of the department and patient locations, gave patient-level alerts for workflow prioritization, and provided real-time summary data about ED performance/state | 1. Survey questionnaire, including: |
| Schall et al (2017) | CSS | Medical center | N/A | Nurses and physicians: provided at-a-glance visibility of timely and accurate critical patient safety indicator information | 1. The dashboard reduced error rates in task-based evaluation by avoiding visual "clutter" compared with conventional EHR displays |
| Fuller et al (2020) | CSS | Academic medical center | Post = 12 months | Physicians, physician assistants: the dashboard provided direct access within the EHR, obtained opioid-management information in real time, and displayed it via color coding; it alerted clinicians to pain management issues and patient risks | 1. Task-based usability evaluation using a standardized sheet to gather tasks from the EHR and the dashboard separately, with audio comments and on-screen activity recorded using Morae |
| Merkel et al (2020) | CS | Acute care and critical care: statewide | Pre = 19 days | Emergency operations committee (EOC) command center operators: allowed each individual health system to track hospital resources in near real time | 1. No outcome measure was reported |
| Bersani et al (2020) | SW | Academic acute-care hospital | Post (random cohort) = 18 months | Prescribers, nurses, patients, caregivers: dashboard accessed directly via the EHR, displaying consolidated EHR information via color coding and critical patient safety indicator information for multiple patients | 1. Dashboard usage (number of logins) and usability (Health-ITUES) |
| Ibrahim et al (2020) | CS | Tertiary academic hospital | Pre = 30 days | The rounding team | 1. Percentage of patients requiring urgent intubation or cardiac resuscitation on the general medical ward |
| Kurtzman et al (2017) | MM | University-owned teaching hospital | Post = 6 months | Internal medicine residents/trainees: dashboard visualizations displayed resident-specific rates of routine laboratory orders in real time | 1. Dashboard utilization using e-mail read-receipts and web-based tracking |
| Paulson et al | CSS | Twenty-one hospital sites (3,922 inpatient beds) | Unclear | RRT, palliative care teams | 1. Number of alerts triggered and percentage activating a call from VQT RN to RRT RN |
| Staib et al (2017) | CS | Tertiary hospital | N/A | Physicians, nurses: ED–inpatient interface (EDii) dashboard to manage patient transfers from the ED to inpatient hospital services | 1. ED length of stay and mortality rates |
Abbreviations: adm(s), admission(s); COVID-19, novel coronavirus disease 2019; CS, case study; CSS, cross-sectional study; ED, emergency department; EHR, electronic health record; EWS, early warning system; ICU, intensive care unit; MM, mixed methods; N/A, not available; PA(s), participant(s); PC, palliative care; PCS, prospective cohort study; PSLL, Patient Safety Learning Laboratory; RCS, retrospective cohort study; RMDS, repeated measures design study; RPCS, retrospective and prospective cohort study; SW, stepped wedge study.
Fig. 2 Framework to identify the different evaluation approaches (metrics, methods, and contexts) for each level of implementation outcome (technical, clinical, and patient outcomes). Health-ITUES, Health Information Technology Usability Evaluation Scale; NASA-TLX, National Aeronautics and Space Administration Task Load Index; PSSUQ, poststudy system usability questionnaire; ROC, receiver operating characteristics; SAI, situational awareness index; SUS, system usability scale; TAM, technology acceptance model; UTAUT, unified theory of acceptance and use of technology; WebQual, website quality instrument.
Comparison of the inclusion criteria between the current review and prior review studies
| Meta study | Care setting | Data timeliness | Data source | Dashboard type | Implementation state |
|---|---|---|---|---|---|
| Wilbanks and Langford | Acute | Any | EHR | Any | Any |
| Dowding et al | Any | Any | Any | Clinical and quality | Implemented |
| West et al | Any | Any | EHR | Visualization | Any |
| Maktoobi and Melchiori | Any | Any | Any | Clinical | Any |
| Buttigieg et al | Acute | Any | Any | Performance | Any |
| Khairat et al | Any | Any | Any | Visualization | Any |
| Auliya et al | Any | Any | Any | Any | Any |
| Our study | Acute | Real time | EHR | Clinical | Implemented |
Abbreviation: EHR, electronic health record.
Inclusion criteria for the present review
| Inclusion criteria | |
|---|---|
| Population | • Adult population (≥18 years of age) |
| Intervention of interest | • Implementation of a real-time/near-real-time analytics product based upon aggregated data within a hospital using an EHR; excludes single-patient-view-only dashboards |
| Study design | • All study designs |
| Publication date | • March 2015–March 2021 |
| Language | • English |
Abbreviation: EHR, electronic health record.
Consolidated view of RQ-1: challenges to implementation
| Area | RQ-1: Consolidation of challenges | Review source: prior | Review source: current |
|---|---|---|---|
| Horizon 2: digital dashboard delivery | | | |
| People | C1: Training & training time | [7,8] | [7] |
| | C2: Resourcing arrangements | [12] | [8] |
| Process | C3: Financial and resource costs | [14] | [8] |
| | C4: Organizational culture | [16] | |
| | C5: Lack of clinical guidelines/benchmarks | [17] | |
| | C6: Changing implementation environment | [15,28,29] | |
| | C7: Implementation time constraints | [17] | |
| | C8: Difficult to assess | [35] | |
| Information | C9: Quantity of data | [18,19] | |
| | C10: Complexity of data | [19] | |
| | C11: Uncertainty of data | [20] | |
| | C12: Quality of data | [21] | |
| | C13: Missing required data | [36] | |
| | C14: Normalization/regularization of data | [22,25] | [20] |
| | C15: Additional manual data entry | [23] | |
| | C16: Lack of nomenclature standardization | [24] | |
| | C17: Need for bioinformatician to extensively code | [21] | |
| Technology | C18: Getting and presenting temporal data | [26] | [25,26,2] |
| | C19: Presenting so much data and different types | [27] | [6,23,27,32,34] |
| | C20: Linking dashboard to EHR data | [28] | |
| | C21: Making data real time | [29,32] | [22,37] |
| | C22: Dashboard reliability/connectivity | [30] | |
| | C23: Integration of heterogeneous data | [31] | [20,24] |
| | C24: Sourcing patient outcome information | [33] | |
| | C25: Handling rare events/small data sets | [34] | |
| | C26: Clinicians having enough info on dashboard | [5] | |
| | C27: Tech teething problems turns off users | [13] | |
| | C28: Support diverse users/workflows/screens | [28,29,31,33] | |
| | C29: Support change to environment | [30] | |
| Horizon 3: clinical model | | | |
| People | C30: Negative impact of dashboard on clinician | [1-5] | [4] |
| | C31: Negative impact of dashboard on patient | [1-5] | |
| | C32: Clinician resistance | [6,9] | [1,11] |
| | C33: Resource concerns | [12] | |
| | C34: Integrating clinician thinking with dashboard | [3] | |
| | C35: Lack of clinician time | [9] | |
| | C36: Understanding variability of data | [26] | |
| Process | C37: Ethical concerns over data usage | [13] | |
| | C38: Different needs in different clinical settings | [15] | [2,28] |
| | C39: Clinical responsibility/disagreement problems | [12,14,16] | |
| | C40: Patient rescue/alert trade-off | [18] | |
| | C41: Earlier alert is good, but clinicians see no benefit | [19] | |
Abbreviations: EHR, electronic health record; RQ, research question.
Note: Challenges derived from prior reviews and this review were grouped into challenge areas. The source raw challenges are identified within square brackets and relate to the numbered challenges listed in Supplementary Table S1 (prior work) and Supplementary Table S8 (this review).
Fig. 4 Proposed real-time clinical dashboard implementation conceptual framework. EHR, electronic health record.
Consolidated view of RQ-2: methods to overcome implementation challenges
| Implementation facet | RQ-2: Summary of solution methods | Review source: prior | Review source: current |
|---|---|---|---|
| Dashboard design method | M1: Human-centered design | [1] | [32,38] |
| | M2: (Interactive) prototyping | [3] | [1,15,16,21] |
| | M3: Multidisciplinary/panel design team | [7,24,58] | |
| | M4: Design for change and re-use | [58,63,36] | |
| | M5: Other design method | [14,37,59,61] | |
| Implementation method | M6: Interdisciplinary implementation approach | [2] | [19,20,45,57,62] |
| | M7: Pilot implementation | [6] | [5,48] |
| | M8: Stakeholder engagement | [43,44,46] | |
| | M9: Staged (iterative) release of dashboard | [17,48,58,60] | |
| | M10: Address workflow/cultural issues upfront | [49,51,64] | |
| | M11: Assess feedback/barriers early and rectify | [41] | |
| | M12: Design for early wins for users | [47] | |
| | M13: Usage feedback (competitive) reports | [42,58,59] | |
| | M14: Other specific implementation methods | [3,9,30,34,35,52,56,62] | |
| | M15: Evaluation methods | [13,16] | |
| Other dashboard considerations | M16: Suggested dashboard content | [5-8] | [8,11,29] |
| | M17: Suggested dashboard functionality | [12,26,28] | |
| | M18: Alert considerations | [2,4,39,53,54,56] | |
| | M19: Color considerations | [22,29,39] | |
| | M20: Metrics considered | [25,27] | |
| | M21: Dashboard access | [31] | |
| Training | M22: Live training (at-the-elbow) | [18,40] | |
| | M23: Other training approach | [40,50] | |
| Resources and costs | M24: Personnel considerations | [4,10,35,53] | |
| | M25: Method to reduce cost | [4] | |
| Technology | M26: Technology | [33] | |
Abbreviation: RQ, research question.
Note: Methods were manually grouped into method areas from the prior research and this review. The source raw methods are identified within square brackets and relate to the numbered methods listed in Supplementary Table S2 (prior work) and Supplementary Table S10 (this review).