James M Kariuki1, Eric-Jan Manders1, Janise Richards1, Tom Oluoch2, Davies Kimanga3, Steve Wanyee4, James O Kwach2, Xenophon Santas1. 1. Division of Global HIV & TB, Center for Global Health, Centers for Disease Control and Prevention, Atlanta, USA. 2. Centers for Disease Control and Prevention-Kenya, Nairobi, Kenya. 3. National AIDS & STI Control Programme (NASCOP), Nairobi, Kenya. 4. The International Training and Education Center for Health (I-TECH) - Kenya, Nairobi, Kenya.
Abstract
Introduction: Developing countries are increasingly strengthening national health information systems (HIS) for evidence-based decision-making. However, the inability to report indicator data automatically from electronic medical record systems (EMR) hinders this process. Data are often printed and manually re-entered into aggregate reporting systems. This affects data completeness, accuracy, and reporting timeliness, and burdens staff who support routine indicator reporting from patient-level data. Method: After conducting a feasibility test to exchange indicator data from the Open Medical Records System (OpenMRS) to the District Health Information System version 2 (DHIS2), we conducted a field test at a health facility in Kenya. We configured a field-test DHIS2 instance, similar to the Kenya Ministry of Health (MOH) DHIS2, to receive HIV care and treatment indicator data, and KenyaEMR, a customized version of OpenMRS, to generate and transmit the data from a health facility. After training facility staff on how to send data using the DHIS2 reporting module, we compared the completeness, accuracy, and timeliness of automated indicator reporting with facility monthly reports manually entered into the MOH DHIS2. Results: All 45 data values in the automated reporting process were 100% complete and accurate, while in the manual entry process data completeness ranged from 66.7% to 100% and accuracy ranged from 33.3% to 95.6% over seven months (July 2013-January 2014). The manual tally and entry process required at least one person to perform each of five reporting activities; generating data from the EMR with manual entry required at least one person to perform each of three reporting activities; the automated reporting process had one activity performed by one person. Manual tally and entry observed in October 2013 took 375 minutes.
The average time to generate data and manually enter it into DHIS2 was over half an hour (M=32.35 mins, SD=0.29), compared to less than a minute for automated submission (M=0.19 mins, SD=0.15). Discussion and Conclusion: The results indicate that sending indicator data electronically from an OpenMRS-based EMR at a health facility to DHIS2 improves data completeness, eliminates transcription errors and delays in reporting, and reduces the reporting burden on human resources. This increases the availability of quality indicator data using available resources to facilitate monitoring service delivery and measuring progress towards set goals.
Keywords:
DHIS2; Health information systems; OpenMRS; data aggregation; electronic medical records; health information exchange; indicator reporting; interoperability
The fight against HIV has played a major role in the implementation and use of Health
Information Systems (HIS) in many low and middle-income countries for management of
longitudinal health records. The number of HIV patients enrolled on antiretroviral
therapy (ART) has increased exponentially over the last ten years due to improved
access to HIV testing and revised guidelines requiring early initiation of ART among
those infected [1-5]. This has accelerated further adoption and scale up of
electronic medical records (EMR) and monitoring and evaluation (M&E) information
systems to monitor patient and program outcomes [6]. International donor
organizations, such as the Joint United Nations Programme on HIV/AIDS
(UNAIDS) and the United States Government’s President’s Emergency Plan
for AIDS Relief (PEPFAR), have set ambitious goals towards ending the epidemic and
developed comprehensive indicators for monitoring progress made through various
programs [7-9]. Achieving these goals requires a shift to a data-driven approach
that uses data from the national level down to the service delivery (health
facility) level.

eHealth is a key enabler and driver of improved health outcomes and an essential
infrastructure to support information exchange between all participants in the
health care system [10,11]. The ability to use information for monitoring health
service delivery, planning programs, reporting health indicators, measuring
achievement, and improving accountability requires timely, reliable, high-quality,
and accessible health service data. These are key to realizing global health goals,
especially in resource-limited settings [12-14]. Effectively managing health data
requires robust HIS, including both an EMR to manage patient records and an
aggregate data system for M&E. Although some countries have adopted and
implemented both EMR and aggregate systems with national coverage, these tend to be
standalone or silo systems [15]. Interoperability among HIS is essential to
achieving health goals by facilitating the availability and use of quality health
data.

Despite increased adoption of electronic HIS, the lack of data exchange remains a
challenge to data quality and availability [16]. Printing electronic data from one
system and re-entering it into another system manually is commonplace. Manual data
entry is labor intensive and prone to transcription errors. It increases the time
from when the indicator data are generated in the EMR to its availability in the
aggregate data system, and increases the workload for health workers responsible for
reporting [17,18]. As data demand increases, limited-resource sites may struggle to
hire and sustain the staff needed to support manual data reporting [14,19]. All
these factors can potentially affect the ongoing monitoring of health programs,
planning and resource allocation for health services, and delivery of quality and
efficient healthcare services.

In earlier work, we conducted a laboratory-based study examining the feasibility of
automating reporting of a subset of PEPFAR’s next generation indicators from
Open Medical Records System (OpenMRS), an open-source medical records system [20],
to District Health Information System version 2 (DHIS2), a tool for collection,
validation, analysis, and presentation of aggregate statistical data, tailored to
integrated health information management activities [21]. This study demonstrated
that data generated from OpenMRS and sent electronically to DHIS2 can maintain the
accuracy and completeness needed to develop appropriate indicators [22]. It also
indicated that an automated indicator reporting process had the potential to provide
timely health information and reduce staff workload. In this work, we extend our
findings to conduct an experiment with a field-based study exploring the impact of
adopting HIS interoperability at the facility level.
Methods
Field-test study design
We designed a mixed-method field test to compare human resources reporting
efforts, data accuracy and completeness, and timeliness of submitting data
reports from the health facility to the national health management information
system (HMIS) for manual and automated indicator data reporting processes.

We conducted the study in two phases and used four data collection methods. In
the first phase, we developed, tested, and implemented the automated software,
while in the second phase we examined the automation’s impact. The four
data collection methods were:

1. Document review of Kenya MOH HIV facility reporting tools and the National AIDS & STI Control Programme (NASCOP) indicator manual to identify indicators to automate and understand how they are calculated;
2. Desk review of monthly reports at the facility and national levels to audit data quality and compare reporting time between the two phases for manual and automated reporting;
3. Focus group discussion with health facility staff to gain their perspectives on the data collection, aggregation, and submission process; and
4. Observation of the manual reporting process to determine the data collection and aggregation procedures used by the staff.
Definitions of variables measured
For this study, we defined three variables:

• Human resources are the staff required to support indicator data reporting from the facility EMR to the national HMIS.
• Data quality focuses on the data accuracy and completeness characteristics of indicator data sent from the facility to the national HMIS for manual or automated reporting. It is calculated as the percentage of expected data values entered completely and accurately into the national HMIS [23].
  • Data completeness is the degree to which values of all selected indicator data elements in the facility monthly reports, generated from the EMR or manual tally, are available in the national HMIS.
  • Data accuracy is the degree of concordance between indicator data values in the facility monthly reports, generated from the EMR or manual tally, and data values in the national HMIS.
• Reporting time is the time taken to prepare and submit indicator data reports into the national HMIS from the facility.
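The two data-quality measures defined above can be sketched as follows. This is an illustrative sketch only: the data element names and values are hypothetical, not actual MOH 731 data elements.

```python
# Minimal sketch of the completeness and accuracy measures defined above.
# The facility report is the reference; the HMIS report is what was entered.

def completeness(facility_report: dict, hmis_report: dict) -> float:
    """Percentage of expected data values that are present in the national HMIS."""
    expected = list(facility_report)
    present = [k for k in expected if hmis_report.get(k) is not None]
    return 100.0 * len(present) / len(expected)

def accuracy(facility_report: dict, hmis_report: dict) -> float:
    """Percentage of expected data values whose HMIS value matches the facility report."""
    expected = list(facility_report)
    matching = [k for k in expected if hmis_report.get(k) == facility_report[k]]
    return 100.0 * len(matching) / len(expected)

# Hypothetical monthly report: one transcription error, one missing value.
facility = {"on_art_total": 120, "new_on_art": 15, "enrolled_in_care": 140}
hmis = {"on_art_total": 120, "new_on_art": 51, "enrolled_in_care": None}

print(round(completeness(facility, hmis), 1))  # 66.7 (one value missing)
print(round(accuracy(facility, hmis), 1))      # 33.3 (one missing, one transcription error)
```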
Study Setting
We identified Kenya for the field-test study because both DHIS2 and OpenMRS are
currently in use and supported by the MOH. Within Kenya, we selected Kisumu East District Hospital
comprehensive care clinic (CCC) because it met our three inclusion criteria: 1)
large number of electronic patient records (over 3,000 patients enrolled on HIV
treatment), 2) Kenya MOH support, and 3) established EMR and data entry
processes. Kisumu East District Hospital CCC is relatively large (approximately
13,185 adult and pediatric patients were receiving HIV care and treatment at the
time of the study) and is operated by the Kenyan MOH. The hospital has used
KenyaEMR (a customized version of OpenMRS) for more than one year as a
point-of-care system and for retrospective data entry of routine patient
data.
Field-test study data collection
We collected data on manual and automated reporting work processes following the
steps shown in Figure 1.
Figure 1
Indicator data reporting field-test study work process
Manual reporting work process
Focus group discussion (FGD): We conducted an FGD with seven key
staff who use KenyaEMR routinely and are responsible for HIV indicator
reporting at the facility. These staff included clinical officers, nursing
officers, health records and information officer (HRIO), and data clerks at
the HIV clinic. During the FGD, we collected information about EMR use,
current indicator data reporting process to MOH DHIS2, and human resources
effort required for reporting (i.e., the number of staff and duration per
month).

Observation of the facility reporting process: We observed the
current reporting process by shadowing the facility’s data clerk and
HRIO while they prepared the routine monthly HIV care and treatment report.
We documented the process and recorded the approximate time it took to
accomplish each step using a stopwatch. In addition, we produced the monthly
indicator data report in KenyaEMR and recorded the time to generate a report
during the study period.

Manual reporting data quality desk review: We collected copies
of the HIV care and treatment indicator reports from July 2013 to January
2014 at the facility (see Appendix 1
for a sample reporting form) and produced the facility’s monthly
indicator reports from MOH DHIS2 for the same timeframe. Then, we
transferred the data in the monthly reports into the data quality comparison
tool (Figure 2) to compare the
completeness and accuracy of indicator data submitted to MOH DHIS2 and the
reports at the facility. We counted the number of data elements with
accurate values and data elements with transcription errors or missing
values in the two reports and summarized the results in the same tool. For
manual reporting, we used the report at the facility as reference.
Figure 2
Data quality comparison tool
Automation implementation
Selection of HIV indicators to automate: First, we reviewed the
MOH Comprehensive HIV/AIDS Facility Reporting Form (Appendix 1) to identify HIV care and treatment
indicators to automate [24]. Selected indicators had data routinely
collected in KenyaEMR at the facility, including all data elements required
to generate aggregate data values.

Indicator data automation configuration in the facility
KenyaEMR: To ensure that the study did not interrupt the normal
facility monthly reporting to MOH, we created a separate DHIS2 instance with
data elements identical to those in the MOH DHIS2, using the OpenMRS to DHIS2
indicator automation guide developed in the previous study [25]. Next, we
generated an XML report definition template with the HIV care and treatment
data elements. Then we mapped the data elements in the report definition
template to the PEPFAR HIV care and treatment indicators to enable us to reuse
the SQL code created during the feasibility study [22,26]. Using an SQL
editor, we created query statements to generate each data element value from
KenyaEMR and added them to the XML report definition template. Next, we loaded
the DHIS2 reporting module into the facility KenyaEMR and uploaded the XML
report definition template embedded with SQL queries. Finally, we connected
the two systems by adding field-test DHIS2 login and link details to the
KenyaEMR.

Training and report submission into the field-test DHIS2: To
ensure a successful implementation, we trained facility staff responsible
for reporting on sending indicator data electronically from KenyaEMR to
field-test DHIS2 using the DHIS2 Reporting Module. The facility staff
generated and transmitted HIV care and treatment indicator data for seven
months, July 2013 to January 2014, from the facility KenyaEMR to the
field-test DHIS2. After transmission, a report containing indicator data
values and transmission results for each month was saved on a computer. We
also recorded the time to generate and transmit each report from the
KenyaEMR to the field-test DHIS2.

Automated reporting data quality desk review: To verify the
completeness and accuracy of the automated indicators, we produced HIV care
and treatment indicators data reports from the field-test DHIS2 and printed
the reports submitted automatically from the KenyaEMR for the seven months.
Then, we compared the data in the two reports for each month. We counted the
number of indicator data elements with identical values, transcription
errors, or missing values in the two reports and recorded the results in the data
quality comparison tool (Figure 2,
above). In the automated reporting process, we used the report produced in
KenyaEMR at the facility as reference.
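The automation steps described above can be sketched end to end: an SQL query aggregates one data-element value from patient-level records, and the value is packaged in the dataValueSet format that DHIS2 accepts over its Web API (POST /api/dataValueSets). This is an illustrative sketch only: the table, columns, and UIDs are hypothetical, and an in-memory SQLite database stands in for the KenyaEMR database.

```python
import json
import sqlite3

# In-memory stand-in for the EMR database (hypothetical schema).
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE art_enrollment (patient_id INTEGER, start_date TEXT);
    INSERT INTO art_enrollment VALUES
        (1, '2013-10-03'), (2, '2013-10-17'), (3, '2013-09-28');
""")

# One aggregate data element: patients newly started on ART in the reporting period.
(new_on_art,) = conn.execute(
    "SELECT COUNT(*) FROM art_enrollment "
    "WHERE start_date BETWEEN '2013-10-01' AND '2013-10-31'"
).fetchone()

# Package the value as a DHIS2 dataValueSet; all UIDs here are hypothetical.
payload = {
    "dataSet": "HivCareDataSetUid",
    "period": "201310",            # DHIS2 monthly period for October 2013
    "orgUnit": "KisumuEastCccUid",
    "dataValues": [{"dataElement": "NewOnArtUid", "value": str(new_on_art)}],
}
print(json.dumps(payload, indent=2))
# In a live setup the reporting module would POST this payload to the
# configured DHIS2 instance's /api/dataValueSets endpoint using the stored login details.
```

Mapping each data element to its own SQL statement, as the report definition template does, keeps the EMR schema details out of DHIS2 and makes each transmitted value independently auditable.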
Analysis
We developed a workflow diagram of the current reporting process at the facility
using the data captured during the FGD and observation notes. We reviewed the
workflow with facility staff to confirm that it captured the current reporting
process (see Current Reporting Process in Figure
3).
Figure 3
Comparison of human resources required for the three reporting
processes

For the descriptive data analysis, we reviewed data in the data quality
comparison tool and summarized the data on data quality dimensions. Then we
calculated the percentage of complete and accurate indicator data values entered
into the national HMIS out of the total indicator data values expected each
month to compare data quality between the manual and automated reporting
processes. We also reviewed the observation notes and tabulated data on time to
produce and submit reports for each month during the study period in Microsoft
Excel. Then we calculated the average time required to generate and submit the
report for each reporting process. Finally, we developed workflow diagrams on
human resources, graphs to visualize data completeness and accuracy, and a
timeline graphic with mean, median and interquartile range to visualize data
reporting time for manual and automated reporting processes.
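The reporting-time summary in the last step (mean, median, interquartile range) can be sketched as follows. The per-month times are illustrative stand-ins rather than the study's recorded measurements.

```python
# Sketch of the reporting-time summary statistics described above,
# for seven hypothetical monthly automated submissions (minutes).
from statistics import mean, median, quantiles

automated = [0.12, 0.15, 0.33, 0.10, 0.45, 0.08, 0.11]

# quantiles(n=4) returns the three quartile cut points; q3 - q1 is the IQR.
q1, q2, q3 = quantiles(automated, n=4)
print(f"mean={mean(automated):.2f} median={median(automated):.2f} IQR={q3 - q1:.2f}")
```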
Results
Facility reporting process
Data were entered into the facility EMR by clinicians during patient visits or
retrospectively by data clerks. These data were also captured in paper
registers. When planning the field-test, we expected that facility staff
generated and printed indicator data reports from patient data in the KenyaEMR,
then manually entered them into MOH DHIS2. However, the FGD and observations
indicated that staff manually tallied HIV indicator data from several registers
to compile the MOH reporting form before manual entry into MOH DHIS2, bypassing
the KenyaEMR entirely (see Current Reporting Process in Figure 3).

In terms of staffing responsible for HIV reporting, one receptionist/data clerk
prepares the HIV indicator report every month, and one of the three health records
and information officers (HRIOs) at the facility reviews and submits it to MOH
DHIS2. The HRIOs have access to MOH DHIS2 through the facility’s internet
and can manually enter and submit the facility’s indicator data reports to
MOH.
Human resources burden comparison for manual and automated reporting
This study compares the human resources needed for manual and automated indicator
reporting activities. The current reporting process has five activities, each
requiring at least one person to complete. The data clerk collects registers
from various clinics, tallies and aggregates values for each indicator, compiles
the monthly indicator report form, and sends it to the HRIO office. The facility
HRIO reviews the report, then manually enters the data into DHIS2 (see Current
Reporting Process in Figure 3).

The expected reporting process has three activities, each requiring at least one
person to complete. The data clerk generates the monthly indicator data report
from KenyaEMR, prints it, and sends it to HRIO office. The facility HRIO reviews
the report, then manually enters the data into DHIS2 (see Expected Reporting
Process in Figure 3).

The automated reporting process has one activity to generate, review, and submit
the indicator report electronically into DHIS2. This activity can be completed
by one HRIO within the EMR (see Automated Reporting Process in Figure 3).
Comparison of data completeness and accuracy between manual and automated
reporting processes
Using the selection criteria, we identified eight indicators for automation in
the HIV care and treatment section of the MOH 731 reporting form. Including
disaggregates, these eight indicators had 45 data elements. Data values for all
45 selected indicator data elements submitted electronically from KenyaEMR to
DHIS2 for the seven months were 100% complete and accurate. Manually entered
indicator data averaged 89% completeness (ranging from 66.7% to 100%) and 71%
accuracy (ranging from 33.3% to 95.6%). This indicates that during manual data
entry, some indicator data values were not entered and transcription errors were
introduced. Figure 4 shows the completeness
and accuracy for both processes during the seven-month study period.
Figure 4
Comparison of completeness and accuracy between manual and automated data
elements
Comparison of indicator data reporting time between manual and automated
reporting
Figure 5 shows the time required to prepare
and get indicator data into DHIS2 for each reporting process. While this study
assumed the data were already contained in KenyaEMR, it also includes the time for the
current manual reporting process observed in October 2013. Of the 375 minutes needed for the
current process, approximately 345 minutes (92%) of the total time was used to
aggregate and compile indicator data. The 30 minutes required to enter indicator
data values manually into DHIS2 were comparable to the expected reporting
process.
Figure 5
Timeline comparing time to generate and submit indicator data into DHIS2
using manual and automated processes
Discussion
The study findings indicate that it is feasible and beneficial to automate indicator
data reporting from EMRs to aggregate data systems, and that implementing an
automated process improves the completeness and accuracy of indicator data reports. In
addition, the results validate that automating reporting reduces the
facility’s human resource burden by eliminating manual data entry, which is
time and labor intensive and prone to human error [18].

The automated reporting process required one staff member to complete a single
task, while the manual reporting processes required at least one staff member for
each activity (five in the current process, three in the expected process).
Often staff are pulled away from their health care-related
responsibilities to prepare reports. As data demand increases, high-volume sites may
not be able to sustain the human resources needed to support manual data reporting.
With automated reporting, existing staff can generate and send reports from the EMR
to the national HMIS, eliminating the need for additional staff. Surprisingly, this
study discovered that facility staff were bypassing KenyaEMR and spending more than
half a workday (5.75 hours) aggregating reporting data from paper records by hand
each month. While out of scope for this study, this indicates a need for further
investigation.

Automated entry improved the timely availability and quality of indicator data,
consistent with studies on automated entry of surveillance data [17,18]. While
aggregate data were complete using the automated process, data completeness in the
manual system fell as low as about two-thirds. Not only was the data accuracy much
higher for automated data entry than for manual data entry, there was also a
substantial difference in the time required to generate and submit data between the
manual and automated reporting processes. This is fundamental to monitoring
progress toward meeting performance targets, planning and resource allocation, and
identifying areas needing additional support to improve health outcomes and
impact.

While there is great promise in indicator reporting automation, a number of
issues need to be addressed to ensure successful implementation. During the
transition from paper-based systems, some patients’ data or records may not
be in the EMR [27,28]. This requires proper planning to ensure that key data for
automated reporting is in the EMR or the ability to coordinate reporting using
different methods. Procedures for routine data quality assurance and audits in the
EMR are necessary to ensure data exchanged is of acceptable quality. Staff will
require training and active engagement to adjust to new work practices and
workflows. Robust information technology infrastructure (including reliable electric
power, adequate computers and Internet access) and support at the facility are
critical to ensure consistent EMR availability. In addition, staff should actively
identify and share lessons learned and best practices with other facilities, MOH,
and funders to help improve use of the EMR and system interoperability [29].

Furthermore, there is a need to assess the feasibility of automating indicator data
reporting from other EMRs used in resource-limited settings to national HMIS. This
will consolidate information to guide development of standardized indicator data
reporting, as well as identify the best approaches to support scale-up of electronic
health information systems with available resources.

Limitations: Our study had a few limitations. We were not able to
observe and record the time taken to review reports before data entry into DHIS2,
which would have provided additional information when comparing the manual and
automated reporting processes. Therefore, we excluded the time taken to review
indicator data reports for all processes to ensure consistency. While the quality of data
in the EMR is important, in this study we defined data quality as the completeness
and accuracy of aggregate data transmitted from KenyaEMR to DHIS2 (i.e., the data
sent were the same as the data received). The quality of the patient-level data used
to calculate the indicator was not assessed. We assumed that the study site
experienced similar data quality challenges reported in other studies conducted in
resource-limited settings [16,30,31]. In addition, only one facility was used in
this study, which limits our ability to generalize the results widely. This effect
was minimized by selecting a facility that is typical of HIV care and treatment
facilities in Kenya and uses the same indicator reporting process. The large number
of patients enrolled at our study site enabled us to make observations that would be
expected in other busy, yet understaffed health facilities.
Conclusion
This study demonstrates that sending indicator data automatically from a health
facility EMR (based on OpenMRS) to the national-level reporting system (DHIS2) is
both possible and beneficial. It eliminates the need for manual data entry that can
introduce transcription errors and reduces delays, thus improving indicator data
completeness and accuracy for use at the facility, subnational, and national levels.
It also reduces the amount of time to prepare and submit indicator data and the
number of facility staff required to fulfil reporting requirements at health
facilities, which is key to scale up of HIS without the need for additional human
resources.

There is potential to increase indicator data completeness, accuracy, and availability
in the national HMIS. Additionally, increasing the focus on automated indicator data
reporting may facilitate the development of internationally recognized data exchange
standards for aggregate data, which is fundamental to monitoring global health
outcomes and impact. Further studies should be conducted on the effect of the use of
data exchange standards for automated reporting using different EMRs on data quality
and timeliness.