
Electronic health records improve clinical note quality.

Harry B Burke1, Laura L Sessums1, Albert Hoang1, Dorothy A Becher1, Paul Fontelo2, Fang Liu2, Mark Stephens3, Louis N Pangaro1, Patrick G O'Malley1, Nancy S Baxi4, Christopher W Bunt3, Vincent F Capaldi4, Julie M Chen4, Barbara A Cooper4, David A Djuric5, Joshua A Hodge5, Shawn Kane4, Charles Magee1, Zizette R Makary4, Renee M Mallory4, Thomas Miller3, Adam Saperstein3, Jessica Servey3, Ronald W Gimbel6.   

Abstract

BACKGROUND AND OBJECTIVE: The clinical note documents the clinician's information collection, problem assessment, and clinical management, and it is also used for administrative purposes. Electronic health records (EHRs) are being implemented in clinical practices throughout the USA, yet it is not known whether they improve the quality of clinical notes. The goal of this study was to determine whether EHRs improve the quality of outpatient clinical notes.
MATERIALS AND METHODS: A 5.5-year longitudinal retrospective multicenter quantitative study comparing the quality of handwritten and electronic outpatient clinical visit notes for 100 patients with type 2 diabetes at three time points: 6 months before the introduction of the EHR (before-EHR), 6 months after the introduction of the EHR (after-EHR), and 5 years after the introduction of the EHR (5-year-EHR). QNOTE, a validated quantitative instrument, was used to assess the quality of the outpatient clinical notes; its scores range from a low of 0 to a high of 100. Sixteen primary care physicians with active practices used QNOTE to rate the quality of the 300 patient notes.
RESULTS: The before-EHR, after-EHR, and 5-year-EHR grand mean scores (SD) were 52.0 (18.4), 61.2 (16.3), and 80.4 (8.9), respectively, and the changes in score from before-EHR to after-EHR and from before-EHR to 5-year-EHR were 18% (p<0.0001) and 55% (p<0.0001), respectively. All the element and grand mean quality scores improved significantly over the 5-year interval.
CONCLUSIONS: The EHR significantly improved the overall quality of the outpatient clinical note and the quality of all its elements, including the core and non-core elements. To our knowledge, this is the first study to demonstrate that the EHR significantly improves the quality of clinical notes.
© The Author 2014. Published by Oxford University Press on behalf of the American Medical Informatics Association.


Keywords:  QNOTE; clinical note; clinical quality; electronic health record; note quality


Year:  2014        PMID: 25342178      PMCID: PMC4433367          DOI: 10.1136/amiajnl-2014-002726

Source DB:  PubMed          Journal:  J Am Med Inform Assoc        ISSN: 1067-5027            Impact factor:   4.497


INTRODUCTION

The clinical note [1, 2] documents the physician's information collection [3–7], problem assessment [3–7], and clinical management [3–8]. In addition to its clinical uses, it is important for patient safety [5, 6, 9–11], quality assurance [5, 12, 13], legal proceedings [4, 5, 14], billing justification [4, 6, 14, 15], and medical education [3, 16–18]. EHRs are being implemented in clinical practices throughout the USA, yet the basic functions of clinical notes have not changed despite the transition from a paper to an electronic format. The benefits of EHRs include the instantaneous availability of medical records [6, 9, 16] and the elimination of illegible notes [6, 16, 19, 20]. It is not known, however, whether EHRs improve the quality of clinical notes. We developed a quantitative instrument, QNOTE, to measure clinical note quality; a validation study found QNOTE to be a valid and reliable measure of clinical note quality [2]. We used QNOTE to assess the quality of outpatient primary care clinical notes for patients with type 2 diabetes who were seen in clinic at three successive time points: before EHR implementation (before-EHR), approximately 6 months after EHR implementation (after-EHR), and approximately 5 years after EHR implementation (5-year-EHR). We hypothesized that the implementation of an EHR would improve the quality of outpatient clinical visit notes.

METHODS

This is a 5.5-year longitudinal retrospective multicenter study. QNOTE is a validated quantitative instrument that assesses the quality of the clinical note in terms of 12 clinical elements: chief complaint, history of present illness (HPI), problem list, past medical history, medications, adverse drug reactions and allergies, social and family history, review of systems, physical findings, assessment, plan of care, and follow-up information. Seven components are used to assess the elements: clear, complete, concise, current, organized, prioritized, and sufficient information. The external and internal validations of QNOTE have been described previously, and the format of the note was shown not to affect the quality ratings [2]. Briefly, the physician outpatient notes of patients with type 2 diabetes (patients could have comorbid conditions) who had been seen in clinic on at least three occasions were used for this study; these notes were also used for the prior instrument validation study. The three occasions were: approximately 6 months before EHR adoption (before-EHR), approximately 6 months after EHR adoption (after-EHR), and approximately 5 years after EHR adoption (5-year-EHR). This resulted in three outpatient clinical visit notes per patient: one-third of the notes were handwritten (before-EHR) and two-thirds were electronic (after-EHR and 5-year-EHR). The before-EHR visit notes were free text; the after-EHR and 5-year-EHR visit notes were electronic templates structured by the physicians. QNOTE has been shown to be equally reliable for assessing handwritten and electronic notes [2]. From the pool of 537 patients, 100 study patients were randomly selected, resulting in 300 study notes: 100 before-EHR, 100 after-EHR, and 100 5-year-EHR.
To rate the visit notes, we recruited 8 general internal medicine and 8 family medicine Military Health System (MHS) physicians in the District of Columbia metropolitan area; 10 were military physicians and 6 were civilian physicians, and none had prior experience assessing the quality of clinical notes. Before starting their reviews, the raters were shown the QNOTE instrument and instructed on how to complete it online; they received no other training. They were told to read each note and score it using QNOTE. The notes and the raters were each independently randomized before the notes were distributed to the raters. Raters scored the components of each element as fully acceptable, partially acceptable, or unacceptable. Not all of the components were used to evaluate all of the elements, but every element was evaluated using at least one component. After the raters completed their ratings, a score was assigned to each component rating: fully acceptable (100), partially acceptable (50), and unacceptable (0). The average of the component scores was the score for that element. The reviewers were not contacted during the review process and were not compensated for participating in the study. The Uniformed Services University of the Health Sciences Institutional Review Board approved the study. QNOTE scores are reported as means (range 0–100) and SDs. The element scores were compared with their grand mean scores using Student's t-test. Changes in the element and grand mean scores across the time intervals were calculated as percentages and compared using Student's paired t-test. Pearson's correlation coefficient was used to compare elements over time, and the F-test was used to test for equality of variance. All calculations were performed using SAS 9.7 (SAS Institute, Cary, North Carolina, USA).
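The scoring scheme described above is simple to state precisely: each component rating maps to a numeric score, an element's score is the mean of its component scores, and a note's grand mean is the average of its 12 element scores. A minimal sketch in Python (the function names and example ratings are illustrative, not part of the QNOTE instrument itself):

```python
from statistics import mean

# Rating-to-score map described in the Methods.
RATING_SCORES = {"fully acceptable": 100, "partially acceptable": 50, "unacceptable": 0}

def element_score(component_ratings):
    """Score one element as the mean of its component scores."""
    return mean(RATING_SCORES[r] for r in component_ratings)

def grand_mean(element_scores):
    """A note's grand mean: the average of its 12 element scores."""
    return mean(element_scores)

# Hypothetical example: an element rated on three components.
hpi = element_score(["fully acceptable", "partially acceptable", "fully acceptable"])
# (100 + 50 + 100) / 3, i.e. roughly 83.3
```

Because every component score is 0, 50, or 100, element and grand mean scores always fall in the 0–100 range reported in the abstract.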

RESULTS

The before-EHR, after-EHR, and 5-year-EHR grand mean scores (SDs) were 52.0 (18.4), 61.2 (16.3), and 80.4 (8.9), respectively (table 1), and the changes in score from before-EHR to after-EHR and from before-EHR to 5-year-EHR were 18% (p<0.0001) and 55% (p<0.0001), respectively (table 2). All the element and grand mean quality scores improved significantly over the 5-year interval. The improvement in scores was accompanied by an increasing similarity of element scores (figure 1). The correlation between element scores was r = 0.96 (p<0.0001) comparing before-EHR with after-EHR, and r = 0.61 (p = 0.035) comparing before-EHR with 5-year-EHR; over time, the element scores became less correlated with the before-EHR scores.
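The reported percent changes follow directly from the grand mean scores. A quick arithmetic check using the Table 1 values (illustrative only, not from the paper):

```python
# Grand mean scores from Table 1.
before, after, five_year = 52.0, 61.2, 80.4

def pct_change(old, new):
    """Percent change from an old score to a new score."""
    return 100 * (new - old) / old

# (61.2 - 52.0) / 52.0 is about 17.7%, reported as 18%.
# (80.4 - 52.0) / 52.0 is about 54.6%, reported as 55%.
assert round(pct_change(before, after)) == 18
assert round(pct_change(before, five_year)) == 55
```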
Table 1:

QNOTE mean element scores (SD) at each time point and each element's mean score compared with its grand mean score

| Elements | Before-EHR | After-EHR | 5-year-EHR |
|---|---|---|---|
| Chief complaint | 62.5 (28.3) | 70.9 (26.9) | 78.2 (23.7) |
| History of present illness | 63.2 (26.1) | 60.5 (33.8) | 84.7 (18.3) |
| Problem list | 24.0 (28.9)* | 38.6 (31.6)* | 59.6 (31.2)* |
| Past medical history | 29.4 (36.4)* | 39.4 (35.1)* | 84.7 (24.0) |
| Medications | 59.0 (39.0) | 67.1 (32.3) | 90.8 (17.7) |
| Adverse drug reactions and allergies | 69.1 (33.7) | 77.3 (30.4) | 79.3 (29.4) |
| Social and family history | 25.4 (27.9)* | 35.4 (30.9)* | 72.7 (26.0)* |
| Review of systems | 30.7 (33.3)* | 48.1 (38.7)* | 80.4 (23.8) |
| Physical findings | 66.3 (27.6) | 70.3 (32.9) | 85.8 (16.5) |
| Assessment | 65.5 (24.4) | 75.8 (21.4) | 86.6 (14.5) |
| Plan of care | 65.4 (24.6) | 75.7 (21.7) | 85.3 (15.3) |
| Follow-up information | 63.5 (27.0) | 75.7 (22.3) | 81.7 (20.5) |
| Grand mean | 52.0 (18.4) | 61.2 (16.3) | 80.4 (8.9) |

*Element score is significantly below the grand mean (p<0.05). †Element score is significantly above the grand mean (p<0.05).

EHR, electronic health record.

Table 2:

Comparison of percent change in QNOTE element scores over time (p values)

| Elements | Before vs after | Before vs 5-year |
|---|---|---|
| Chief complaint | 13% (<0.05) | 25% (<0.0001) |
| History of present illness | −4% (NS) | 34% (<0.0001) |
| Problem list | 61% (<0.002) | 144% (<0.0001) |
| Past medical history | 34% (NS) | 188% (<0.0001) |
| Medications | 14% (NS) | 54% (<0.0001) |
| Adverse drug reactions and allergies | 12% (NS) | 15% (<0.05) |
| Social and family history | 39% (<0.05) | 186% (<0.0001) |
| Review of systems | 57% (<0.0005) | 162% (<0.0001) |
| Physical findings | 6% (NS) | 29% (<0.0001) |
| Assessment | 16% (<0.002) | 32% (<0.0001) |
| Plan of care | 16% (<0.002) | 31% (<0.0001) |
| Follow-up information | 19% (<0.0005) | 29% (<0.0001) |
| Grand mean | 18% (<0.0001) | 55% (<0.0001) |
Figure 1:

The shape of QNOTE element scores at the three time points.

The significant shift to higher quality scores was accompanied by a significant reduction in the grand mean SD, from 18.4 (before-EHR) to 8.9 (5-year-EHR), p = 0.015. Other than the problem list variance, which did not change significantly over time, all the elements' variances decreased over the before-EHR to 5-year-EHR interval. At the before-EHR time point, four elements, namely problem list, past medical history, social and family history, and review of systems, were significantly below the before-EHR grand mean (table 1). These four elements' scores demonstrated the largest improvements over time. Also at the before-EHR time point, four elements, namely adverse drug reactions and allergies, physical findings, assessment, and plan of care, were significantly above the before-EHR grand mean. These elements' scores also improved significantly, but less so than the four elements that had been below the grand mean.
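The F-test used in the Methods compares two variances by their ratio. A quick check with the grand mean SDs from Table 1 (illustrative arithmetic only; the paper's exact F statistic is not reported):

```python
# Grand mean SDs from Table 1.
sd_before, sd_5yr = 18.4, 8.9

# The F statistic for equality of variances is the ratio of the two variances
# (squared SDs), with the larger variance in the numerator.
f_ratio = (sd_before ** 2) / (sd_5yr ** 2)
# Roughly 4.27: the before-EHR variance is more than four times the 5-year variance.
```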

DISCUSSION

The introduction of the EHR significantly improved the quality of the outpatient clinical notes. All the element and grand mean quality scores improved significantly, some within 6 months and all by the end of 5 years. This suggests that it took the physicians some time to learn how to use the capabilities of the EHR effectively, and that its structural design led them to write more detailed and comprehensive notes. The decline in the correlation of the element scores from before-EHR to after-EHR and to 5-year-EHR reflects changes in their relationships brought about by the introduction of the EHR. The reduction in element score variances by the end of 5 years demonstrates that the improvement in scores occurred across physicians, rather than in a subset of physicians. The 'core' elements that physicians focus on during the patient visit are the chief complaint, HPI, physical findings, assessment, plan of care, and follow-up. At the before-EHR time point all the core elements had quality scores greater than 60, which suggests that the physicians were already focusing on them before the advent of the EHR. The core elements' average score increased from 64.4 (before-EHR) to 83.7 (5-year-EHR), p<0.001, a significant 30% increase in quality. Because the core elements are not produced by auto-population or other automated systems, it is reasonable to conclude that the EHR's effect on physician performance is responsible for the observed 30% improvement in core note quality. Many EHRs include check boxes, presumably to make it easier for physicians to complete a detailed yet comprehensive note. But check boxes have two unintended, pernicious consequences: they provide an opportunity for physicians to gloss over important aspects of the note, and they reduce our ability to assess the quality of the note. EHRs also create larger boxes within which specific information is entered.
The problem with this idea is that the information associated with QNOTE's elements may be scattered throughout the note. For example, the HPI is a chronological story that explores the patient's chief complaint [21]. The HPI contains all the information relevant to the chief complaint, and it can include information from many parts of the note. Furthermore, complex patients can generate complex notes that do not easily fit into discrete boxes. Finally, a recent survey of physicians found that onscreen boxes make it more difficult to practice medicine, turning face time into screen time and thus into physician frustration [22]. It would be better to assist physicians with the non-core elements of their notes, but allow them to write the core elements themselves. We could then use QNOTE, in conjunction with natural language processing, to assess the core elements and note quality. There has been a great deal of interest in the copying and pasting of information from previous notes into the current note [23, 24]. Physicians have been copying from previous notes since long before the introduction of the EHR; it was a time-consuming but necessary activity to carry forward and incorporate relevant clinical information into the daily note. What is different today is that, because little effort is required to cut and paste material in an EHR, indiscriminate copying and pasting is occurring, resulting in bloated and confusing notes that contain redundant, outdated, and even incorrect information. Eliminating the ability of physicians to cut and paste is not a viable option, because there are many occasions when important past information needs to be included in the current note. QNOTE is sensitive to inappropriate cutting and pasting; when it occurs, the QNOTE score suffers. Thus, one way to curb this practice is to evaluate clinicians' notes and provide them with feedback on note quality.
The clinical note is the basis for medical coding and billing; billing levels are based on the evaluation and management (E&M) information contained in the note. In recent years there has been an increase in upcoding; between 2001 and 2010, physicians significantly increased their billing for higher-level E&M codes [25]. There is a proposal that the American Medical Association's proprietary CPT coding system require physicians to provide substantially more documentation of their medical decision making [26]. If this is approved, more attention will have to be paid to the quality of the information in the note. In addition, the new International Classification of Diseases (ICD-10-CM) requires more documentation [27]. Currently, there is no standard for assessing note quality, and without a standard it is difficult to provide physicians with the systematic feedback they need to improve and maintain the quality of their notes. QNOTE can assist physicians in the evaluation of their notes, and it can help them discover and correct deficits in their documentation. The physician–patient encounter can be analyzed on at least three interrelated levels. The first level is the clinical note, which is assessed in terms of the quality of its elements, that is, clinical note quality. The second level is the clinical encounter, which is assessed in terms of the quality of the physician's collection of the relevant clinical information, analysis of that information, and the clinical plan based on that analysis, that is, clinical quality. The third level is the clinical outcome, which is assessed in terms of the health of the patient, where health is usually defined by one or more clinical outcomes. In this view, a high-quality clinical note is necessary, but not sufficient, for determining the quality of the clinical encounter.
In other words, if the clinical note is not of sufficient quality, the clinical quality of the encounter cannot be determined; however, a high-quality note does not guarantee a high-quality clinical encounter. Likewise, a high-quality clinical note is necessary but not sufficient for assessing patient outcomes. We selected patients with type 2 diabetes because their visits are generalizable to all but the simplest primary care visits and because their visits require full and complete clinical notes. Type 2 diabetes is the second most common diagnosis, after hypertension, in ambulatory medicine [28]. Patient visits can be for diabetes alone, including tests and medications, but more commonly they include the management of multiple comorbid conditions. Although we call these visits diabetic because the patients have type 2 diabetes, in reality these patients have most major diseases, including cardiovascular, renal, neurological, infectious, and musculoskeletal conditions, and they exhibit the full range of severity of illness [29]. In a recent population-based survey of 3761 adults with type 2 diabetes, 82.4% reported one or more morbidities, with a mean of 2.4 comorbidities [29], and the average patient with diabetes takes three or more medications [30]. In other words, visits of patients with diabetes involve a wide variety of diseases representing the spectrum of clinical medicine, and, because of the patients' complexity, the physician's clinical note must be detailed, accurate, and complete. Visits for acute, self-limiting illnesses usually produce notes that are too brief for a proper evaluation of their quality. This study has several limitations. First, it was performed using MHS records.
It is well established that the MHS patient distribution is similar to that of the civilian community and that the MHS provides care equivalent to civilian care [31–34]. Second, although the MHS EHR is similar in most respects to commercial EHRs, these results may not be generalizable to all EHR systems. Third, we know of no MHS-wide billing or diabetes quality improvement initiatives related to the clinical note, but if there were any, they might account for some, though not most, of our findings. Fourth, we assessed primary care clinic visits; we do not know the relationship between the introduction of the EHR and specialty clinic visit note quality. Our study has several strengths. It was a large, 5.5-year longitudinal multicenter study that used the validated QNOTE instrument to rigorously evaluate the impact of the EHR on the quality of clinical notes. To our knowledge, it is the first longitudinal quantitative assessment of the quality of outpatient clinical notes using a validated instrument.

CONCLUSIONS

The EHR significantly improved the overall quality of the outpatient clinical note and the quality of all its elements, including the core and non-core elements. To our knowledge, this is the first study to demonstrate that the EHR significantly improves the quality of clinical notes.
REFERENCES (29 in total)

1. Milchak JL, Shanahan RL, Kerzee JA. Implementation of a peer review process to improve documentation consistency of care process indicators in the EMR in a primary care setting. J Manag Care Pharm. 2012 Jan-Feb.
2. Gimbel RW, Pangaro L, Barbour G. America's "undiscovered" laboratory for health services research. Med Care. 2010 Aug.
3. Rosenbloom ST, Stead WW, Denny JC, Giuse D, Lorenzi NM, Brown SH, Johnson KB. Generating clinical notes for electronic health record systems. Appl Clin Inform. 2010.
4. Hirschtick RE. A piece of my mind. Copy-and-paste. JAMA. 2006 May 24.
5. Silfen E. Documentation and coding of ED patient encounters: an evaluation of the accuracy of an electronic medical record. Am J Emerg Med. 2006 Oct.
6. Aung E, Donald M, Coll J, Dower J, Williams GM, Doi SAR. The impact of concordant and discordant comorbidities on patient-assessed quality of diabetes care. Health Expect. 2013 Oct 24.
7. Staroselsky M, Volk LA, Tsurikova R, et al. An effort to improve electronic health record medication list accuracy between visits: patients' and physicians' response. Int J Med Inform. 2007 Apr 16.
8. Shaw N. Medical education & health informatics: time to join the 21st century? Stud Health Technol Inform. 2010.
9. El-Kareh R, Gandhi TK, Poon EG, et al. Trends in primary care clinician perceptions of a new electronic health record. J Gen Intern Med. 2009 Apr.
10. Black AD, Car J, Pagliari C, et al. The impact of eHealth on the quality and safety of health care: a systematic overview. PLoS Med. 2011 Jan 18.
