Safiya Richardson1, David Feldstein2, Thomas McGinn1, Linda S Park2, Sundas Khan1, Rachel Hess3, Paul D Smith2, Rebecca Grochow Mishuris4, Lauren McCullagh1, Devin Mann5.
Abstract
BACKGROUND: The potential of electronic health records (EHRs) and clinical decision support (CDS) systems to improve the practice of medicine has been tempered by poor design and the resulting burden they place on providers. CDS is rarely tested in the real clinical environment. As a result, many tools are hard to use, placing strain on providers and resulting in low adoption rates. The existing CDS usability literature relies primarily on expert opinion and provider feedback via survey. This is the first study to evaluate CDS usability and the provider-computer-patient interaction with complex CDS in the real clinical environment.
Keywords: clinical decision support; clinical prediction rules; health informatics; live usability; provider adoption; usability; usability testing; user experience; workflow
Year: 2019 PMID: 30985283 PMCID: PMC6487349 DOI: 10.2196/12471
Source DB: PubMed Journal: JMIR Hum Factors ISSN: 2292-9495
Figure 1. Clinical decision support tool calculator.
Figure 2. Clinical decision support tool automatic order set.
Live usability testing results.
| Coding category: example comments or actions^a | Summary and recommendation |
| --- | --- |
| Patient: “Was it last year or the year before – didn’t I have to get a pneumonia shot?” | During every testing session, the provider was interrupted during use of the CDS^b tool by the need to refer to other sections of the chart. |
| Provider: “Have you had a chest X-ray anytime recently?” | Recommendation: Complex CDS should be built for disrupted workflow, with easy and obvious re-entry points. |
| *During every testing session, the progress note served as the center point of the provider’s interaction with the electronic health record.* | —^e |
| “It’s the first thing that comes up...but you have to get all that info from the patient first. So that’s what I mean by clunky.” [PCI^d] | — |
| *At the start of the visit, all providers navigated immediately to the progress note. Half of them spent more than 95% of the visit with this function open, and only 1 spent more than 40% of the visit time with it open.* [QM^f] | Recommendation: CDS tools that exist within the progress note may have higher adoption rates because they are more likely to be present at the time of decision making. |
| Provider: “So I read your chart; it says that you’ve been having symptoms as deer season?” | In half of the sessions, patient history challenged the validity of the clinical prediction rule used to calculate risk. |
| Patient: “I actually called in and Dr. [name] gave me a prescription...” | — |
| “Sometimes...something in your clinical encounter still says, 'get the X-ray or still treat,' you know, maybe you saw them before.” [PCI] | Recommendation: CDS tools should be as broadly applicable as possible, with clear indications for use. |
| Provider: “OK, so our little risk calculator here is recommending that we would swab you for strep throat, and I agree with that.” | In every session in which the tool was used to assess risk, the provider completed the calculator with the patient. |
| Provider: “But your heart is beating kinda fast, you’ve had a fever last night...the recommendation would be to get a chest x-ray today.” | — |
| “I like to be able to show it to patients. So that part of it I really – I like to have that support, and that extra backup for the decision that I want to make.” [PCI] | Recommendation: CDS tools should be designed to be viewed by the patient and provider simultaneously. |
| Patient: “My brother’s living with me, he’s a vet...” | In every testing session, the providers toggled between addressing either the computer or the patient during the visit. |
| Provider: “So basically to summarize: about 9 days ago is when you first got sick...” | — |
| *(Silence while physician types)* | — |
| *Providers spent 0% to 3% of their visit time listening to the patient without simultaneously engaging with the computer.* [QM] | Recommendation: Providers may find CDS tools easier to complete if they engage patients. |
| Provider: “Hold on, I just need the laboratory to actually put in the results... my thing isn’t popping up for me to prescribe the antibiotics quite yet.” | Providers were able to complete the tool quickly; however, during half of the sessions, hard stops and fixed elements in the tool created barriers to usability. |
| “The patient instructions have some hard stop, so I got frustrated with that, and then eventually deleted and typed my own patient instructions in.” [PCI] | — |
| “Cause it’s short. If it were any longer, I’d probably get frustrated with it.” [PCI] | — |
| *Providers spent about 1 min of visit time completing the CDS tool.* [QM] | Recommendation: Tools that are short, customizable, and flexible to different workflows will have improved usability. |
| *In every session, providers used neither the automatic order set nor the automatic documentation.* | — |
| Provider: “So the antibiotic that I would pick for you is one called Azithromycin.” | — |
| “It’s easier for me to order a chest X-ray just outside of the order set...then get the results back and go on with the patient visit. And then at that point, it’s like the opportunity has been lost to use the [automatic order] set.” [PCI] | Recommendation: Elements that are incorporated into CDS tools as incentives should save the provider time or effort when compared with the usual workflow. |
^a Provider and patient statements during the visit are included in quotation marks; provider actions are in italics.
^b CDS: clinical decision support.
^c CXR: chest x-ray.
^d PCI: provider comments during interview.
^e The summary and recommendation for each coding category applies to all of the example data listed for that category.
^f QM: quantitative measurements.
Figure 3. Clinical decision support system proposed workflow.