Literature DB >> 25413722

Toward utilization of data for program management and evaluation: quality assessment of five years of health management information system data in Rwanda.

Marie Paul Nisingizwe1, Hari S Iyer2, Modeste Gashayija3, Lisa R Hirschhorn4, Cheryl Amoroso5, Randy Wilson6, Eric Rubyutsa3, Eric Gaju3, Paulin Basinga7, Andrew Muhire3, Agnès Binagwaho8, Bethany Hedt-Gauthier9.   

Abstract

BACKGROUND: Health data can be useful for effective service delivery, decision making, and evaluation of existing programs in order to maintain a high quality of healthcare. Studies have shown variability in the quality of data from national health management information systems (HMISs) in sub-Saharan Africa, which threatens the utility of these data as a tool to improve health systems. The purpose of this study is to assess the quality of Rwanda's HMIS data over a 5-year period.
METHODS: The World Health Organization (WHO) data quality report card framework was used to assess the quality of HMIS data captured from 2008 to 2012 in a census of all 495 publicly funded health facilities in Rwanda. Factors assessed included completeness and internal consistency of 10 indicators selected based on WHO recommendations and priority areas for the Rwanda national health sector. Completeness was measured as the percentage of non-missing reports. Consistency was measured as the absence of extreme outliers, internal consistency between related indicators, and consistency of indicators over time. These assessments were done at the district and national levels.
RESULTS: Nationally, the average monthly district reporting completeness rate was 98% across 10 key indicators from 2008 to 2012. Completeness of indicator data increased over time: 2008, 88%; 2009, 91%; 2010, 89%; 2011, 90%; and 2012, 95% (p<0.0001). Comparing 2011 and 2012 health events to the mean of the three preceding years, service outputs increased by 3% (2011) and 9% (2012). Eighty-three percent of districts reported ratios between related indicators (ANC/DTP1, DTP1/DTP3) consistent with HMIS national ratios.
CONCLUSION AND POLICY IMPLICATIONS: Our findings suggest that HMIS data quality in Rwanda has been improving over time. We recommend maintaining these assessments to identify remaining gaps in data quality and sharing results publicly to support increased use of HMIS data.

Keywords:  Rwanda; data quality; data use; global health; health management information system; quality improvement

Year:  2014        PMID: 25413722      PMCID: PMC4238898          DOI: 10.3402/gha.v7.25829

Source DB:  PubMed          Journal:  Glob Health Action        ISSN: 1654-9880            Impact factor:   2.640


National health data are required for planning and evaluating service delivery (1–3). This planning and evaluation is critical in developing countries, where the majority of health services are provided through national programs and limited funds must be used efficiently and effectively (1–4). In these settings, high data quality is important to ensure that decisions reflect program needs and direct health professional education priorities (2–6). Poor data quality not only contributes to poor decisions and loss of confidence in the systems, but also threatens the validity of impact evaluation studies (7).

In most countries, health management information systems (HMISs) serve as the primary data source for national health planning and evaluation (2, 4). However, existing evidence suggests variable and often poor quality of these data (7–15). In 2009, the World Health Organization (WHO) shared a framework for assessing HMIS data quality through checks of completeness, internal consistency, and external consistency (16), offering countries a way to measure data quality and identify gaps.

The Rwanda Ministry of Health (MoH) introduced an electronic HMIS in 2008. Given this established electronic system, there is an opportunity to use HMIS data for evaluation and policy making in Rwanda. The HMIS can also provide national-level estimates, as representative surveys are expensive, are typically conducted only every 3–5 years, and do not necessarily provide estimates at the lowest catchment area of service delivery (17). While examples exist of interventions conducted in Rwanda to improve HMIS data quality (18–20), no formal assessment of the quality of Rwanda's HMIS data exists. The purpose of this study is to assess the quality of Rwanda's HMIS data from 2008 to 2012.

Methods

Rwanda National HMIS

Prior to 2008, the Rwanda HMIS existed almost entirely in paper form. Rwanda began using an electronic HMIS in 2008 to capture facility healthcare data. Indicators collected include service uptake data for key programs (e.g. immunization, family planning, and antenatal care) and general health systems data (e.g. drug availability and financial information). Patient-level data are recorded in paper-based registers by care providers. Data are aggregated at the facility level, and monthly reports are submitted to the district team. Prior to 2012, reports were then forwarded to the central MoH office and imported into an electronic system. Since 2012, the MoH has used a web-based system (DHIS2) that allows data entry to be done at the facility. This system allows data to be stored centrally while the facility maintains and views its data from a local database. In 2012, there were 922 health facilities in Rwanda, 748 (81%) of which were public; the remaining 174 (19%) were private.

WHO data quality report card

Noting the importance of HMIS data with regards to national and sub-national health sector planning, the WHO introduced the data quality report card framework (16). This framework provides standardized methods for assessing data quality in different low-income settings around the world, and outlines a series of checks that can be conducted quickly to identify inconsistencies in national HMIS systems.

Data and analysis

Data were extracted from Rwanda's national HMIS database covering all facility reports from January 2008 through December 2012. Using the WHO report card framework (16), we assessed the data quality of the 495 publicly funded health facilities that were open for the duration of the reporting period. The assessment focused on two dimensions of quality: completeness and internal consistency of reported data. Ten indicators were included in the assessment, selected based on WHO recommendations and priority areas for the national health sector (Table 1).
Table 1

List of indicators included in the HMIS data quality assessment

ID    Code        Indicator
I1    ANC1        New ANC registrations
I2    ANC4        Women who completed four standard ANC visits
I3    OPD         Outpatient visits
I4    Deliveries  Total deliveries
I5    FP          Women using family planning at the end of the month
I6    Riskrefer   Number of patients referred to hospitals
I7    DTP1        Children who received the first diphtheria–pertussis–tetanus dose
I8    DTP2        Children who received the second diphtheria–pertussis–tetanus dose
I9    DTP3        Children who received the third diphtheria–pertussis–tetanus dose
I10   U5visit     Number of visits by children under the age of five

Completeness of reported data

Completeness of reporting at health facility and completeness of indicator data in a report were measured on indicators 1–10 (Table 1).

Completeness of facility reporting

At the national level, completeness of facility reporting was measured as the number of monthly reports received divided by the expected number of reports in a given year (12 × number of health facilities reporting that year). At the district level, we calculated the proportion of districts with facility reporting rates below 80%; these districts were considered to have poor reporting.
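As a rough illustration, the facility-reporting completeness check described above can be sketched in Python. The function names, district names, and report counts here are hypothetical; the study used actual Rwanda HMIS monthly reports.

```python
def facility_reporting_completeness(reports_received, n_facilities):
    """National completeness: monthly reports received divided by the
    expected number of reports (12 months x number of facilities)."""
    expected = 12 * n_facilities
    return reports_received / expected

def poorly_reporting_districts(district_rates, threshold=0.80):
    """Flag districts whose facility reporting rate falls below 80%,
    the cutoff for poor reporting used in the assessment."""
    return [d for d, rate in district_rates.items() if rate < threshold]

# Hypothetical example: 495 facilities, 5,841 monthly reports received.
rate = facility_reporting_completeness(5841, 495)  # 5841 / 5940
```

A district-level pass would then call `poorly_reporting_districts` on a mapping of district names to their own reporting rates.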

Completeness of indicator data

Completeness of indicator data was measured as the percentage of non-missing values for key indicators. At the national level, this percentage was calculated by summing all non-missing values across key indicators for a specified period and dividing by the expected number (12 months×30 districts×10 indicators). A district was considered to have incomplete indicator reporting if more than 20% of its values across the 10 indicators were missing.
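The indicator-data completeness check can be sketched similarly. The nested `{district: {indicator: [monthly values]}}` structure and the 20% district threshold are assumptions for illustration; missing values are represented as `None`.

```python
def indicator_completeness(values):
    """Overall completeness: non-missing values divided by all expected
    values across districts, indicators, and months."""
    total = sum(len(months) for inds in values.values()
                            for months in inds.values())
    non_missing = sum(1 for inds in values.values()
                        for months in inds.values()
                        for v in months if v is not None)
    return non_missing / total

def incomplete_districts(values, threshold=0.20):
    """Flag districts with more than 20% missing values across indicators."""
    flagged = []
    for district, inds in values.items():
        total = sum(len(m) for m in inds.values())
        missing = sum(1 for m in inds.values() for v in m if v is None)
        if missing / total > threshold:
            flagged.append(district)
    return flagged
```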

Internal consistency of reported data

Extreme and moderate outliers for indicators 1–10, trends over time for indicators 1, 3, 4, and 9, and internal consistency of I7 (compared to I1) and I9 (compared to I7) were examined.

Moderate and extreme outliers

Moderate outliers were defined as monthly values that were at least ±2 standard deviations from the average value of the indicator for a given district for a specified period of time. Extreme outliers were at least ±3 standard deviations.
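A minimal sketch of this outlier check, using the standard-deviation cutoffs above. The sample mean and standard deviation of the district's monthly series are computed from the data itself, and each value is labeled; the function name is hypothetical.

```python
import statistics

def classify_outliers(monthly_values):
    """Label each monthly value: 'extreme' if at least 3 SD from the series
    mean, 'moderate' if at least 2 SD, otherwise 'normal'."""
    mean = statistics.mean(monthly_values)
    sd = statistics.stdev(monthly_values)
    labels = []
    for v in monthly_values:
        z = abs(v - mean) / sd if sd else 0.0  # all-equal series -> no outliers
        if z >= 3:
            labels.append("extreme")
        elif z >= 2:
            labels.append("moderate")
        else:
            labels.append("normal")
    return labels
```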

Internal consistency between indicators

Consistency between new antenatal care registrations (ANC1) and diphtheria–pertussis–tetanus first doses (DTP1) was measured by calculating a DTP1/ANC1 ratio for each district. These ratios are recommended by the WHO framework because the indicators in each pair are expected to track one another. If a district's ratio differed from the national ratio by 33% or more, it was considered inconsistent. Consistency between DTP1 and diphtheria–pertussis–tetanus third doses (DTP3) was calculated by dividing the total number of DTP3 doses by the total number of DTP1 doses for each district. We report the percentage of districts where the number of DTP3 immunizations exceeded the number of DTP1 immunizations by 2% or more, a marker of inconsistency.
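The between-indicator check can be sketched as below. District totals are a hypothetical `{district: (dtp1_total, anc1_total)}` mapping; the 33% tolerance is the WHO cutoff described above.

```python
def national_ratio(district_totals):
    """National DTP1/ANC1 ratio from summed district totals."""
    dtp1 = sum(d for d, _ in district_totals.values())
    anc1 = sum(a for _, a in district_totals.values())
    return dtp1 / anc1

def inconsistent_districts(district_totals, tolerance=0.33):
    """Flag districts whose DTP1/ANC1 ratio differs from the national
    ratio by 33% or more."""
    nat = national_ratio(district_totals)
    return [d for d, (dtp1, anc1) in district_totals.items()
            if abs(dtp1 / anc1 - nat) / nat >= tolerance]
```

The DTP3/DTP1 check is analogous: compute each district's DTP3/DTP1 ratio and flag those exceeding 1.02.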

Consistency over time

The check for consistency over time calculated the ratio of the reported values in 2011 and 2012 for a specific indicator to the mean value of the same indicator for the previous 3 years combined. At the subnational level, this indicator looks at the percentage of districts with at least 33% difference between their ratio and the national ratio, a marker of inconsistency.
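The consistency-over-time ratio itself is a simple calculation, sketched here with hypothetical annual delivery counts; a ratio of 1.21 corresponds to a 21% increase over the three-year baseline.

```python
def time_consistency_ratio(current_total, preceding_totals):
    """Current-year event total divided by the mean annual total of the
    preceding years (the WHO framework uses the preceding 3 years)."""
    baseline = sum(preceding_totals) / len(preceding_totals)
    return current_total / baseline

# Hypothetical example for deliveries: 121,000 events in the current year
# against a baseline averaging 100,000 per year.
r = time_consistency_ratio(121_000, [95_000, 100_000, 105_000])  # 1.21
```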

Results

Completeness of facility reporting increased from 2008 to 2012 (Table 2). Seven percent of districts in 2008 reported a completeness rate below 80%, which decreased to 0% in 2012. Completeness of indicator data increased over time from 88% in 2008 to 95% in 2012 (p<0.0001). The proportion of districts with >20% missing values decreased from 7% in 2008 to 0% in 2012.
Table 2

Completeness of facility reporting and indicator data (2008–2012)

                                                    2008   2009   2010   2011   2012
National district completeness rate (%)               95     99     98    100    100
Districts with completeness rate below 80% (%)         7      0      0      0      0
Completeness of indicator data (%)                    88     91     89     90     95
Districts with more than 20% missing values (%)        7      0      3      3      0
At the national level, the percentage of moderate and extreme outliers was 0% across all years (Table 3). At the sub-national level, no district reported more than 5% of monthly values as extreme or moderate outliers. At the facility level, the mean percentage of outliers was 4% from 2008 through 2011 and 3% in 2012; extreme outliers were found only in 2012 (3%). In 2008, 10% of districts had DTP1/ANC1 ratios at least 33% above the national ratio; this percentage decreased to 0% in 2012. In 2009, 13% of districts had DTP1/ANC1 ratios at least 33% below the national ratio, which also decreased to 0% in 2012. The percentage of districts where DTP3 exceeded DTP1 by 2% or more was high in 2009 (17%) and 2012 (23%).
Table 3

Outliers and internal consistency between indicators (2008–2012)

                                                              2008   2009   2010   2011   2012
Extreme and moderate outliers
  Proportion of values that are moderate outliers(a)            0%     0%     0%     0%     0%
  Proportion of values that are extreme outliers(a)             0%     0%     0%     0%     0%
Internal consistency between DTP1 and ANC1
  National DTP1/ANC1 ratio                                    0.87   0.97   0.87   0.90   0.94
  Districts with DTP1/ANC1 ratio 33% above national ratio      10%     –      0%     0%     0%
  Districts with DTP1/ANC1 ratio 33% below national ratio       0%    13%     0%     0%     0%
Internal consistency between DTP1 and DTP3
  National DTP3/DTP1 ratio                                    0.96   1.00   1.01   0.97   0.99
  Districts where DTP3 is 2% or more greater than DTP1         13%    17%     3%     0%    23%

(a) Numerator = number of outlier values [at least ±2 SD (moderate) or ±3 SD (extreme)] over the 12 months for the 10 indicators; denominator = 120 × number of health facilities (12 months × 10 indicators per facility).

Table 4 shows the consistency-over-time ratios for 2011 and 2012. There was a 21% increase in reported deliveries in 2011 compared to the mean of the three preceding years, and a 14% increase in 2012. For outpatient department visits, there was a 10% decrease in 2011 and a 13% increase in 2012. For all other indicators, the change was minimal.
Table 4

Consistency over time: national ratio of total number of events in the current year to mean number of events in preceding 3 years

                                                                   2011   2012
ANC1 ratio                                                         1.02   1.02
Deliveries ratio                                                   1.21   1.14
DTP3 ratio                                                         1.00   1.05
OPD ratio                                                          0.90   1.13
Districts with 33% difference between their and national ratio      0%    10%

Discussion

Overall, our data quality assessment suggests high and increasing completeness of reporting and internal consistency of Rwanda's HMIS data. The improvement is likely attributable to interventions implemented by the Rwandan government and non-governmental organizations to strengthen health systems and improve data quality. Performance-Based Financing (PBF) (21), introduced in 2010, is one such intervention that may have contributed to improved data quality. Since HMIS reports provide the data that guide incentive payments for PBF, the MoH established rigorous quality checks of HMIS data by district supervisors as part of their formative monthly supervision (19, 20, 22). The change in technology from a locally based system to a web-based system, trainings on how to use the system, and data cleaning done at health facilities have also contributed substantially to this improvement. This is important because Rwanda's HMIS is a data source for local, national, and international policy-makers, and demonstrating high data quality may encourage broader use of these data (17).

While we found improvement in completeness, other metrics identified potential data challenges. We found deviations in the consistency-over-time measures for deliveries and OPD visits. While these findings may indicate poor data quality, they could also be explained by increased uptake of services (21–23). An increased DTP3/DTP1 ratio could result from migration within a district, where the number of children eligible for DTP3 increases or decreases, or from more vaccines being given at the beginning or end of a year. Our results contrast with the other published assessment of HMIS data using the WHO report card framework in sub-Saharan Africa, which found poor data quality (24). They also differ from most results of other assessments of facility data quality, which found gaps in data quality pointing to a need for improvement (7–15).
Another study from Mozambique, using a Global Fund methodology, also found high quality for assessed indicators (3).

Our analysis has limitations. First, private health facilities were excluded. In 2012, private facilities accounted for 19% of all facilities in Rwanda and an estimated 11–15% of health service delivery (A. Muhire, personal communication, October 16, 2014). Private facilities only started reporting to the HMIS in 2012, and because of this difference in implementation time between public and private facilities, we felt they should be analyzed separately. Second, although the indicators were chosen a priori based on WHO recommendations and priority areas for the health sector, we assessed the quality of only 10 indicators captured in the HMIS, limiting our ability to comment on the representativeness of quality for the whole system. Finally, we did not assess reliability (consistency between paper registers at facilities and submitted reports) or accuracy (consistency between actual healthcare utilization at facilities and electronic reports) of Rwanda's HMIS data. Previous studies in Rwanda have examined the reliability of HMIS reports from community health workers compared to register data and found poor reliability of aggregated reports compared to individual patient data (6). However, the errors were not systematically in the direction of over- or under-reporting, suggesting that in aggregate they might cancel out.

Our analysis demonstrates the feasibility of conducting a national assessment of HMIS data quality using the WHO data quality report card framework in a developing country. Since all of the indicators we studied are reported on a monthly basis to an electronic system, these methods can be replicated to provide routine monthly evaluations of HMIS completeness and internal consistency.
We recommend maintaining and expanding these assessments for timely identification of HMIS data quality gaps, and that all sub-Saharan African countries, including Rwanda, integrate such assessments into routine practice. We believe that routine assessments will lead to overall improvement in HMIS data quality and will encourage use of this valuable system for program management and evaluation. We also hope these findings will give other researchers more confidence in using these data for effective health sector decision-making.
References (15 in total):

1.  Improving public health information: a data quality intervention in KwaZulu-Natal, South Africa.

Authors:  W Mphatswe; K S Mate; B Bennett; H Ngidi; J Reddy; P M Barker; N Rollins
Journal:  Bull World Health Organ       Date:  2011-12-05       Impact factor: 9.408

2.  Design and implementation of a health management information system in Malawi: issues, innovations and results.

Authors:  Chet N Chaulagai; Christon M Moyo; Jaap Koot; Humphrey B M Moyo; Thokozani C Sambakunsi; Ferdinand M Khunga; Patrick D Naphini
Journal:  Health Policy Plan       Date:  2005-09-02       Impact factor: 3.344

3.  Assessment of data quality of and staff satisfaction with an electronic health record system in a developing country (Uganda): a qualitative and quantitative comparative study.

Authors:  S P Ndira; K D Rosenberger; T Wetter
Journal:  Methods Inf Med       Date:  2008       Impact factor: 2.176

4.  Utilizing community health worker data for program management and evaluation: systems for data quality assessments and baseline results from Rwanda.

Authors:  Tisha Mitsunaga; Bethany Hedt-Gauthier; Elias Ngizwenayo; Didi Bertrand Farmer; Adolphe Karamaga; Peter Drobac; Paulin Basinga; Lisa Hirschhorn; Fidele Ngabo; Cathy Mugeni
Journal:  Soc Sci Med       Date:  2013-03-01       Impact factor: 4.634

5.  Enhancing the routine health information system in rural southern Tanzania: successes, challenges and lessons learned.

Authors:  W Maokola; B A Willey; K Shirima; M Chemba; J R M Armstrong Schellenberg; H Mshinda; P Alonso; M Tanner; D Schellenberg
Journal:  Trop Med Int Health       Date:  2011-03-14       Impact factor: 2.622

6.  Assessing and improving data quality from community health workers: a successful intervention in Neno, Malawi.

Authors:  A J Admon; J Bazile; H Makungwa; M A Chingoli; L R Hirschhorn; M Peckarsky; J Rigodon; M Herce; F Chingoli; P N Malani; B L Hedt-Gauthier
Journal:  Public Health Action       Date:  2013-03-21

7.  Assessing immunization data quality from routine reports in Mozambique.

Authors:  João C Mavimbe; Jørn Braa; Gunnar Bjune
Journal:  BMC Public Health       Date:  2005-10-11       Impact factor: 3.295

8.  Nationwide implementation of integrated community case management of childhood illness in Rwanda.

Authors:  Catherine Mugeni; Adam C Levine; Richard M Munyaneza; Epiphanie Mulindahabi; Hannah C Cockrell; Justin Glavis-Bloom; Cameron T Nutt; Claire M Wagner; Erick Gaju; Alphonse Rukundo; Jean Pierre Habimana; Corine Karema; Fidele Ngabo; Agnes Binagwaho
Journal:  Glob Health Sci Pract       Date:  2014-08-05

9.  Prevalence and predictors of giving birth in health facilities in Bugesera District, Rwanda.

Authors:  Shahrzad Joharifard; Stephen Rulisa; Francine Niyonkuru; Andrew Weinhold; Felix Sayinzoga; Jeffrey Wilkinson; Jan Ostermann; Nathan M Thielman
Journal:  BMC Public Health       Date:  2012-12-05       Impact factor: 3.295

10.  Challenges for routine health system data management in a large public programme to prevent mother-to-child HIV transmission in South Africa.

Authors:  Kedar S Mate; Brandon Bennett; Wendy Mphatswe; Pierre Barker; Nigel Rollins
Journal:  PLoS One       Date:  2009-05-12       Impact factor: 3.240

