
Measures of electronic health record use in outpatient settings across vendors.

Sally L Baxter1,2, Nate C Apathy3,4,5,6, Dori A Cross7, Christine Sinsky8, Michelle R Hribar9.   

Abstract

Electronic health record (EHR) log data capture clinical workflows and are a rich source of information to understand variation in practice patterns. Variation in how EHRs are used to document and support care delivery is associated with clinical and operational outcomes, including measures of provider well-being and burnout. Standardized measures that describe EHR use would facilitate generalizability and cross-institution, cross-vendor research. Here, we describe the current state of outpatient EHR use measures offered by various EHR vendors, guided by our prior conceptual work that proposed seven core measures to describe EHR use. We evaluate these measures and other reporting options provided by vendors for maturity and similarity to previously proposed standardized measures. Working toward improved standardization of EHR use measures can enable and accelerate high-impact research on physician burnout and job satisfaction as well as organizational efficiency and patient health.
© The Author(s) 2020. Published by Oxford University Press on behalf of the American Medical Informatics Association.

Keywords:  audit log; burnout; electronic health records; measure; metric; vendor

Year:  2021        PMID: 33211862      PMCID: PMC8068413          DOI: 10.1093/jamia/ocaa266

Source DB:  PubMed          Journal:  J Am Med Inform Assoc        ISSN: 1067-5027            Impact factor:   4.497


INTRODUCTION

Although electronic health records (EHRs) have been widely adopted, their use has been associated with decreased physician satisfaction, decreased productivity, and burnout. Single-center studies concerning physicians’ EHR use and attendant burden are difficult to generalize. To address this challenge, some studies have used audit logs from a single vendor to study EHR use across institutions. Yet the burgeoning literature concerning EHR audit log analysis remains somewhat limited in generalizability due to variations in measurement. Absent standardized measures of EHR use that can be utilized across vendor platforms and institutions, broad conclusions regarding the impact of EHRs on burden and burnout will remain elusive. In an effort to improve standardization, core EHR use measures using log data were recently proposed for outpatient physicians (Table 1). These measures capture major domains of EHR use that impact clinical outcomes and physician satisfaction. Some EHR vendors provide “off-the-shelf” EHR use measures for their clients. These are widely scaled solutions that offer an opportunity to learn across healthcare settings, but concerns have been raised regarding vendor-provided measure validation, across-vendor inconsistencies in task and measure definitions, and within-vendor changes to measurement methodologies over time. This study’s purpose was to describe the current state of vendor-provided outpatient EHR use measures. Understanding vendors’ development of these measures can help inform the ongoing process of developing standardized measures.
Table 1. Crosswalk of vendor-provided measures against proposed measures of outpatient physician electronic health record (EHR) use

Total EHR time | Total time on EHR (during and outside of clinic sessions) per 8 h of patient scheduled time. | Cerner: * | Epic: * | AllScripts: ??
Work outside of work | Time on EHR outside of scheduled patient hours per 8 h of patient scheduled time. | Cerner: ** | Epic: ** | AllScripts: N/A
Time on encounter note documentation | Total time on documentation (note writing) per 8 h of patient scheduled time. | Cerner: * | Epic: * | AllScripts: ??
Time on prescriptions | Total time on prescriptions per 8 h of patient scheduled time. | Cerner: ** | Epic: ** | AllScripts: ??
Time on inbox | Total time on inbox per 8 h of patient scheduled time. Proposed numerator includes time spent on actions originating from inbox messages as well as inbox time. | Cerner: ** | Epic: ** | AllScripts: ??
Teamwork for orders | Percentage of orders with team contribution. | Cerner: N/A (a) | Epic: ✓ | AllScripts: N/A
Undivided attention | Amount of undivided attention patients receive from their physician, approximated by [(total time per session) minus (EHR time per session)] / (total time per session). | Cerner: N/A | Epic: N/A | AllScripts: N/A

N/A = vendor does not offer any measures in this domain.

* = vendor offers measure in this domain, but denominator differs from proposed “per 8 h of scheduled clinic time.”

** = vendor offers measure in this domain, but both numerator and denominator differ from proposed measure.

✓ = vendor offers measure in this domain, and measure does not differ meaningfully from proposed measure.

?? = vendor offers measure in this domain, but extent of alignment with proposed measure is unclear.

(a) Although Cerner does not directly provide this measure to users, a dashboard dedicated to ordering and personnel does include the numbers of orders placed by providers and care team members, via computerized physician order entry (CPOE) and other means. This dashboard contains the values necessary to compute the percentage of orders with team contribution.
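The footnote above implies a simple derivation. As a hedged sketch (not Cerner's implementation; the per-order record structure and field names here are invented for illustration), the percentage of orders with team contribution could be computed from order-level counts like so:

```python
def team_contribution_pct(orders):
    """Percentage of orders with at least one care-team contribution."""
    if not orders:
        return 0.0
    # Count orders where someone besides the ordering provider contributed.
    with_team = sum(1 for o in orders if o.get("team_contributors"))
    return 100.0 * with_team / len(orders)

# Hypothetical per-order records assembled from dashboard counts.
orders = [
    {"id": 1, "provider": "dr_a", "team_contributors": ["ma_1"]},
    {"id": 2, "provider": "dr_a", "team_contributors": []},
    {"id": 3, "provider": "dr_b", "team_contributors": ["rn_2"]},
    {"id": 4, "provider": "dr_b", "team_contributors": []},
]
print(team_contribution_pct(orders))  # 50.0
```

In practice the dashboard already aggregates these counts, so the computation reduces to dividing team-contributed orders by total orders.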


METHODS

Based on internal data from the Office of the National Coordinator for Health Information Technology, we included the following vendors with leading outpatient market share in the USA: Epic Systems® (Verona, WI), Cerner Corporation® (Kansas City, MO), AllScripts Healthcare Solutions® (Chicago, IL), eClinicalWorks® (Westborough, MA), athenahealth® (Watertown, MA), and NextGen Healthcare® (Irvine, CA). We asked vendor representatives about the availability of EHR use measures from December 2019 to May 2020. For vendors with available measures, we collected data regarding measure definitions via semistructured interviews with representatives, including technical personnel directly involved in developing and implementing off-the-shelf measures. Interviews involved exchanges between two or three members of the vendor team and two or three members of our research team. For Epic and Cerner, where we had direct user access (eg, via clinical practice at our affiliated health systems), we also explored the vendor-provided EHR usage platforms directly. We describe vendors’ individual approaches to measure development and operationalization and assess alignment with the previously proposed measures to inform ongoing standardization efforts.

RESULTS

Stages of vendor-provided measure development

Vendors vary in the maturity of development of outpatient EHR use measures (Figure 1). NextGen has not developed any programs to accumulate usage data. Although eClinicalWorks makes EHR use metadata accessible for development of custom measures by third-party vendors, it does not provide any native measures. athenahealth is in the process of developing measures, but these are not yet available. The remaining vendors (Epic, Cerner, and AllScripts) have developed EHR use measures in platforms available to clients.
Figure 1.

Continuum of development of outpatient EHR use measures for vendors with leading market share as of spring 2020.

General descriptions of EHR use measures and platforms

AllScripts offers an analytics platform based on patient cohorts (“population sets”) to generate EHR utilization reports. The population sets include broad cohorts (eg, all visits, inpatients, outpatients) and complex cohorts with specific definitions, often based on client requests (eg, sepsis, type 1 diabetics admitted with ketoacidosis). Utilization reports are generated for a given population set and include the following domains: clinical documentation, computerized physician order entry, order set utilization, alerts, knowledge-based medication administration, and “EHR audit reports.” Within these domains, user reports can count both number of activities and time expenditure per activity. Visualization options include an “event timeline by user” that displays event actions and times for individual physicians, and, at the site or system level, an event density map that depicts the distribution of EHR tasks across users over time.

Cerner provides EHR use measures via the Lights On platform, which summarizes commonly reviewed measures in a dashboard and benchmarks user metrics to national specialty metrics. Measures are calculated at the user, specialty, facility, and health system levels, and aggregated to monthly averages by default. Lights On can be configured to record measures across outpatient, inpatient, and emergency department venues, with measures spanning 42 domains (eg, alerts, documentation, orders). Lights On provides broad measures of EHR use within each domain, such as types and counts of activities (eg, number of documents signed), proportions of clinical activity meeting certain criteria (eg, percentage of prescriptions sent electronically), time associated with EHR use activities (eg, after-hours work, defined as time spent in the EHR before 6 am, after 6 pm, and on weekends), and how frequently clinicians use recommended workflows. These measures are normalized to a per-patient-seen denominator, measured as the number of unique patients for whom a provider signed a clinical note during the time period. Measure data in Lights On are available at daily, weekly, monthly, quarterly, and yearly levels, and users can download historical monthly data for all measures. Users can visualize measures over time at different levels of aggregation, and Lights On includes functionality to perform ad hoc pre/post analyses of the impact of system changes (eg, upgrades or workflow configuration changes) on use measures at more granular time intervals. Users can access definitions of measures via help links within Lights On, and Cerner maintains wiki-style resource pages with additional details of measure methodology.

Epic provides EHR use measures via Signal, a platform providing measures in the following domains: in-basket, orders, notes, workload, clinical review, and “other” (assorted measures such as total time in system per day and time in schedule per day). Measures are generated at the provider level and can be aggregated to the specialty, department, or health system levels. Some measures are normalized with different denominators (eg, time in notes per day and per appointment). Several measures describe after-hours work, such as time outside scheduled hours (with 30-min buffers before the first appointment and after the last appointment), time on unscheduled days with no appointments, time outside 7 am to 7 pm, and “pajama time,” which is time outside 7 am to 5:30 pm on weekdays and time on weekends. Measures are calculated based on monthly reporting periods. Measures can be displayed longitudinally over the course of a workday or month-to-month and are compared with averages of all Epic users of the same specialty. Besides time-based measures, Epic also offers measures of use of personalization/efficiency tools. Detailed measure definitions are documented in a library. Currently, this platform is available only for outpatient attending physicians and not other physicians (eg, inpatient providers, residents/fellows) or clinician types (eg, nurses).
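The after-hours definitions above can be made concrete. As an illustrative sketch (not Epic's actual implementation; the function name is an assumption), a log event timestamp can be classified against the “pajama time” window described above, ie, outside 7 am to 5:30 pm on weekdays plus any weekend time:

```python
from datetime import datetime

def is_pajama_time(ts: datetime) -> bool:
    """True if the timestamp falls in 'pajama time' as defined above."""
    if ts.weekday() >= 5:  # Saturday (5) or Sunday (6): all weekend time counts
        return True
    minutes = ts.hour * 60 + ts.minute
    # Weekday time before 7:00 am or at/after 5:30 pm
    return minutes < 7 * 60 or minutes >= 17 * 60 + 30

events = [
    datetime(2020, 3, 2, 6, 45),   # Monday 6:45 am -> pajama time
    datetime(2020, 3, 2, 14, 0),   # Monday 2:00 pm -> regular hours
    datetime(2020, 3, 7, 10, 0),   # Saturday      -> pajama time
]
print([is_pajama_time(e) for e in events])  # [True, False, True]
```

Summing the durations of events classified this way would yield a pajama-time total; the other after-hours variants (eg, outside 7 am to 7 pm, or outside scheduled hours with 30-min buffers) differ only in the window boundaries used.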

Comparing vendor-provided measures with proposed measures

Table 1 depicts how vendor-provided measures compare with the previously proposed measures. The proposed measures are generally centered on time spent on various EHR activities, normalized to an 8-h period of scheduled patient time. EHR vendors provide measures in most domains. Supplementary Appendix 1 details comparisons of vendor-provided measures in each domain. Although vendors provide numerous measures, our comparisons focus on the proposed measure domains only. There are several differences between vendor-provided measures and proposed measures. Total EHR time and time on documentation, prescriptions, inbox, and outside work are all variably defined. For example, vendors’ measures normalize to per-patient or per-day denominators, whereas the proposed measures normalize to 8 hours of scheduled patient time. There are also varying definitions of the numerator for “work outside of work.” For documentation, vendors are also more likely to split subtasks (ie, separating clinical review and note writing) rather than aggregating them as the proposed measure does. For inbox-related work, vendors measure time strictly in the inbox, whereas the proposed measure would also include time on immediate actions related to inbox messages. Finally, no vendor offers an undivided attention measure.
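The normalization difference matters in practice. A minimal sketch of the proposed per-8-h normalization (function name and inputs are assumptions for illustration, not a vendor API):

```python
def ehr_time_per_8h(ehr_minutes: float, scheduled_hours: float) -> float:
    """EHR minutes normalized per 8 h of scheduled patient time."""
    if scheduled_hours <= 0:
        raise ValueError("scheduled_hours must be positive")
    return ehr_minutes * 8.0 / scheduled_hours

# A half-day clinic (4 scheduled hours) with 150 EHR minutes normalizes to a
# lower rate than a full day (8 scheduled hours) with 300 EHR minutes would
# under a naive per-day comparison, but the same rate under this measure:
print(ehr_time_per_8h(150, 4))  # 300.0
print(ehr_time_per_8h(300, 8))  # 300.0
```

A per-day denominator would report these two physicians as doing 150 vs 300 minutes of EHR work, even though their EHR burden per scheduled hour is identical; normalizing to scheduled time removes that artifact of the scheduling template.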

DISCUSSION

We assessed vendor-provided EHR use measures and compared them with measures recently proposed by a multidisciplinary informatics workgroup. Our key findings were: (1) several vendors were in early stages of measure development; (2) differences between vendor-provided measures and proposed measures carry important implications for interpretation and cross-vendor comparison; and (3) ongoing work is needed to improve standardization. First, although three vendors offered measures and well-developed platforms, the remaining vendors have not developed measures or are still developing them. This represents an opportunity for early engagement regarding measure definitions and standardization. Even for vendors with existing platforms, some domains (eg, undivided attention) remain undeveloped, again representing an opportunity for early alignment with vendors as measure libraries expand. Second, differences in measure definitions have several important implications. Vendors often normalize measured time as per patient or per day. However, per-day calculations do not account for variations in physician schedules, ie, half-day vs full-day clinics. In contrast, the proposed measures normalize to 8 hours of scheduled patient time. This enables a more generalizable comparison across variable scheduling templates, but does not account for varying clinical volumes. Both time and clinical volume are important domains for measure standardization. Future work and engagement with physicians and healthcare organizations can inform how best to incorporate clinical volume into an expanded set of standardized measures. The difficulty of standardization is readily apparent in measures of work outside of work.
Variations across vendors illustrate the difficulty of measuring this construct, which is particularly important given that after-hours EHR use has been associated with physician burnout. Vendors’ calculations may vary depending on whether physicians had multiple shorter clinic sessions on different calendar days vs a longer clinic session on a single day. For example, some vendor-provided measures subtract buffer periods before and after clinic hours from work outside of work time, which is arbitrary and may not reflect the scheduling practices or preferences of specific clinicians. In contrast, the proposed measure is normalized to 8 hours of scheduled patient time without any subtraction of time. However, the proposed measure could overestimate work outside of work if the physician is still seeing patients in the clinic past the time of the last scheduled appointment. This may happen when clinics run long or when patients have multistep appointments that include intake activities and ancillary testing; thus, this clinic work may not be accurately labeled as “outside” work time. Overall, work outside of work remains challenging to define for vendors, researchers, and clinicians alike. Another example of the difficulty of accounting for provider time is time spent on inbox management. Inbox and messaging time may be underestimated by vendor-provided measures that strictly count time spent viewing or writing messages. In addition, use of the inbox for specific tasks (eg, embedded order entry) may vary among different EHR systems. Resolving inbox messages often requires work in areas of the EHR other than the inbox, and may even involve tasks that are not reflected in EHR log data at all (eg, phone calls to consulting providers or to patients).
The proposed measure would include these actions, although best practices regarding how to define actions related to inbox messages are still evolving. This is critically important because inbox-related activities pose a well-documented burden on physicians, which will only increase with widespread adoption of telehealth and asynchronous patient engagement in response to the COVID-19 pandemic. Of note, vendors reported that measure data for reporting periods at the onset of the pandemic were skewed due to rapid fluctuations in the volume of both inbox messages and patient appointments. These examples illustrate that both “time on the clock” and clinical volume are important domains for measure standardization. Ongoing work is needed to standardize EHR use measures. One strategy would be to evaluate aligning vendor-provided measures with the standardized proposed measures. This may depend on developing standardized terminology around EHR use and audit log elements, especially because vendors offer different levels of granularity. Furthermore, to calculate these measures, vendors often use other logs, which are more granular than the audit logs required for compliance with federal regulations. Developing standardized terminology for EHR logs is ongoing. In the near term, matched measures on the same vendor platform can enable cross-institution research, a valuable first step toward more generalizable evidence. Over the long term, standardization will require not only consensus on appropriate measure definitions, but also interorganizational collaboration and ongoing maintenance. Although prior studies have criticized vendors, vendors’ initiatives to develop these measures and engage in conversations about standardization, even without top-down regulatory requirements, are encouraging. Our study had limitations. We did not examine measures for inpatient and other settings.
We did not review all vendor-provided measures and focused on those pertaining to the previously proposed core measures, which have not been extensively validated. In addition, we focused on large EHR vendors; future studies are needed to evaluate smaller EHR vendors. Finally, we focused on measure definitions but did not analyze the various forms of data visualization on vendors’ platforms. Developing best practices around data visualization would be a valuable topic for future investigation. In summary, our findings identify high-priority opportunities to align EHR vendor measures with a set of proposed standardized outpatient EHR use measures. We identify measures readily available from vendors that can be used in studies of EHR use and attendant burden. Understanding variations and working toward standardization will facilitate future efforts to compare studies across diverse organizations in order to measure and reduce physician burnout.

FUNDING

This study was supported in part by NIH grants T15LM011271 and T15LM012502.

AUTHOR CONTRIBUTIONS

Each author made substantial contributions to the conception or design of the work; was involved in drafting the work or revising it critically for important intellectual content; gave final approval of the version to be published; and has agreed to be accountable for all aspects of the work.

SUPPLEMENTARY MATERIAL

Supplementary material is available at Journal of the American Medical Informatics Association online.
REFERENCES (17 in total; first 10 shown)

1.  Effects of health information technology on patient outcomes: a systematic review.

Authors:  Samantha K Brenner; Rainu Kaushal; Zachary Grinspan; Christine Joyce; Inho Kim; Rhonda J Allard; Diana Delgado; Erika L Abramson
Journal:  J Am Med Inform Assoc       Date:  2015-11-13       Impact factor: 4.497

2.  Physicians' Well-Being Linked To In-Basket Messages Generated By Algorithms In Electronic Health Records.

Authors:  Ming Tai-Seale; Ellis C Dillon; Yan Yang; Robert Nordgren; Ruth L Steinberg; Teresa Nauenberg; Tim C Lee; Amy Meehan; Jinnan Li; Albert Solomon Chan; Dominick L Frosch
Journal:  Health Aff (Millwood)       Date:  2019-07       Impact factor: 6.301

3.  The impact of electronic health record use on physician productivity.

Authors:  Julia Adler-Milstein; Robert S Huckman
Journal:  Am J Manag Care       Date:  2013-11       Impact factor: 2.229

4.  Virtually Perfect? Telemedicine for Covid-19.

Authors:  Judd E Hollander; Brendan G Carr
Journal:  N Engl J Med       Date:  2020-03-11       Impact factor: 91.245

5.  Tethered to the EHR: Primary Care Physician Workload Assessment Using EHR Event Log Data and Time-Motion Observations.

Authors:  Brian G Arndt; John W Beasley; Michelle D Watkinson; Jonathan L Temte; Wen-Jan Tuan; Christine A Sinsky; Valerie J Gilchrist
Journal:  Ann Fam Med       Date:  2017-09       Impact factor: 5.166

6.  Using electronic health record audit logs to study clinical activity: a systematic review of aims, measures, and methods.

Authors:  Adam Rule; Michael F Chiang; Michelle R Hribar
Journal:  J Am Med Inform Assoc       Date:  2020-03-01       Impact factor: 4.497

7.  Electronic health records and burnout: Time spent on the electronic health record after hours and message volume associated with exhaustion but not with cynicism among primary care clinicians.

Authors:  Julia Adler-Milstein; Wendi Zhao; Rachel Willard-Grace; Margae Knox; Kevin Grumbach
Journal:  J Am Med Inform Assoc       Date:  2020-04-01       Impact factor: 4.497

8.  Metrics for assessing physician activity using electronic health record log data.

Authors:  Christine A Sinsky; Adam Rule; Genna Cohen; Brian G Arndt; Tait D Shanafelt; Christopher D Sharp; Sally L Baxter; Ming Tai-Seale; Sherry Yan; You Chen; Julia Adler-Milstein; Michelle Hribar
Journal:  J Am Med Inform Assoc       Date:  2020-04-01       Impact factor: 4.497

9.  COVID-19 transforms health care through telemedicine: Evidence from the field.

Authors:  Devin M Mann; Ji Chen; Rumi Chunara; Paul A Testa; Oded Nov
Journal:  J Am Med Inform Assoc       Date:  2020-07-01       Impact factor: 4.497

10.  Telehealth transformation: COVID-19 and the rise of virtual care.

Authors:  Jedrek Wosik; Marat Fudim; Blake Cameron; Ziad F Gellad; Alex Cho; Donna Phinney; Simon Curtis; Matthew Roman; Eric G Poon; Jeffrey Ferranti; Jason N Katz; James Tcheng
Journal:  J Am Med Inform Assoc       Date:  2020-06-01       Impact factor: 4.497

