Christine L Hershey, Achuyt Bhattarai, Lia S Florey, Peter D McElroy, Carrie F Nielsen, Yazoume Yé, Erin Eckert, Ana Cláudia Franca-Koh, Estifanos Shargie, Ryuichi Komatsu, Paul Smithson, Julie Thwing, Jules Mihigo, Samantha Herrera, Cameron Taylor, Jui Shah, Eric Mouzin, Steven S Yoon, S René Salgado.
Abstract
As funding for malaria control increased considerably over the past 10 years, expanding the coverage of malaria control interventions, so did the need to measure the impact of these investments on malaria morbidity and mortality. Members of the Roll Back Malaria (RBM) Partnership undertook impact evaluations of malaria control programs at a time when there was little guidance on the process for conducting an impact evaluation of a national-level malaria control program. The President's Malaria Initiative (PMI), as a member of the RBM Partnership, has provided financial and technical support for impact evaluations in 13 countries to date. On the basis of these experiences, PMI and its partners have developed a streamlined process for conducting the evaluations, along with a set of lessons learned and recommendations. Chief among these are: ensure country ownership of and involvement in the evaluations; engage stakeholders throughout the process; coordinate evaluations among interested partners to avoid duplication of effort; tailor the evaluation to the particular country context; develop a standard methodology and a streamlined process for completing the evaluations within a reasonable time; and develop tailored dissemination products for a broad range of stakeholders. These lessons learned and the resulting recommendations will guide future impact evaluations of malaria control programs and other health programs.
Year: 2017 PMID: 28990921 PMCID: PMC5619934 DOI: 10.4269/ajtmh.17-0064
Source DB: PubMed Journal: Am J Trop Med Hyg ISSN: 0002-9637 Impact factor: 2.345
Figure 1. Framework for conducting malaria impact evaluations. Impact evaluations of the malaria control programs generally take 12 months to conduct and can be divided into three phases: initiation, execution, and finalization of the evaluation. Specific activities occurring in each phase are shown. Stakeholder engagement occurs throughout the evaluation, with specific groups of stakeholders brought in at various times.
Challenges and recommendations for conducting malaria impact evaluations
| Issue/challenge | Recommendations |
|---|---|
| Overarching issues | |
| Need to generate buy-in and ensure transparency | • Involve all relevant stakeholders throughout the evaluation process |
| Need to involve many stakeholders but also keep the process streamlined | • Designate three groups of stakeholders with clear roles and responsibilities throughout the evaluation: steering committee, broad stakeholder group, core evaluation team |
| | • Engage the broad stakeholder group when needed (e.g., presentation of findings), but rely on smaller groups of stakeholders (e.g., technical working groups) to focus on a particular technical area |
| Evaluations often take more than a year, making it difficult to maintain stakeholder engagement | • Make every effort to conduct the evaluation within a one-year time frame |
| | • Generate buy-in and reach an agreement on the importance of the evaluation at the outset |
| Potential conflict of interest when the evaluations are led by the NMCP and/or partners | • Consider hiring an independent evaluation team |
| | • Agree to release the findings whether or not impact is demonstrated |
| Initiation phase | |
| Determining when it is appropriate to conduct an impact evaluation | • Discuss with the NMCP, funding agencies and steering committee before commencing the evaluation |
| | • Consider the epidemiological context, the availability of quality data, and the timing and extent of malaria control intervention scale-up (see decision tree, Figure 2) |
| | • Consider postponing the evaluation, or conducting a program review or coverage assessment, if the conditions for a national-level impact evaluation are not met |
| Identification of relevant data sources | • Comprehensive mapping and assessment of existing data, and access to data sources should be discussed prior to and during initial broad stakeholder meetings |
| Multiple partners or funding agencies interested in conducting an evaluation during the same time period can burden the country or cause confusion | • Where possible, partners should collaborate to conduct one joint evaluation |
| | • Where more than one evaluation makes sense, partners should share findings and coordinate the dissemination of the evaluations |
| Need for financial and human resources for conducting the evaluation | • At the outset, partners should agree on their level of funding or in-kind contributions to the evaluation |
| | • Include funding for any capacity-strengthening activities and the final dissemination (e.g., event, printing of full reports or key findings reports) |
| Execution phase | |
| Identifying the right combination of partners and particularly the lead partner for the evaluation | • Identify a lead for the evaluation with the NMCP (and steering committee) |
| | • Consider the capacity and time availability of the NMCP and of local and external technical partners when selecting a lead |
| | • Bring in additional technical assistance as needed |
| | • Designate a dedicated core evaluation team to conduct the evaluation and focus on the day-to-day activities |
| Finalization phase | |
| Stakeholder disagreement over results or conclusions | • Provide ample opportunity for review and discussion |
| | • If findings are contradictory, conclude that the results are inconclusive with regard to that particular aspect of the evaluation |
| | • Stronger methods that leave less room for interpretation will reduce disagreement |
| Evaluation clearance was time consuming | • Establish clear procedures for clearance of the evaluation reports and associated documents at the outset of the evaluation |
| Generating evaluation products that will be of use to multiple stakeholders | • Generate multiple products from the evaluation including a full core report with annexes, key findings report, journal article(s), policy brief, etc. |
| Need to make the impact evaluation results useful to the NMCP and partners | • As part of the dissemination activities an action plan should be developed by all partners to address intervention coverage gaps |
| | • Address data gaps in the action plan, which will improve the evaluation process in the future, including planning for a prospective evaluation |
| The final documents need to be available in both the official local language (e.g., French or Portuguese) and English | • Plan time and funding for translation services |
| | • Provide French (or Portuguese) versions of analysis plans, protocols, and report outlines |
| | • Consider producing the key findings report in both the official language and English, but the full core report only in the official language |
NMCP = national malaria control program.
Figure 2. Malaria impact evaluation decision process. When deciding whether it is appropriate to conduct a national-level impact evaluation, several considerations need to be taken into account, including the epidemiological context, intervention coverage levels, time since intervention scale-up began, time that intervention coverage levels have been sufficiently high, and time since the last evaluation. Several alternatives to a full national-level impact evaluation are provided. This decision process is applicable to high-burden countries using all-cause childhood mortality as the primary impact measure. DHS, Demographic and Health Survey; IE, impact evaluation; IGME, UN Inter-agency Group for Child Mortality Estimation; IHME, Institute for Health Metrics and Evaluation; MICS, Multiple Indicator Cluster Survey.
Suggested timeline for conducting malaria impact evaluations
| Activity | Estimated time (week) | Partners involved |
|---|---|---|
| Initiation phase | ||
| Start discussions with NMCP, in-country and international stakeholders | 4 | Funding agencies, PMI in-country teams, national authorities, steering committee |
| Contract local and external technical partners and identify any remaining members of core evaluation team and steering committee, agree on TOR (including facilitating data access) | 6 | Funding agencies, core evaluation team, steering committee |
| Develop work plan, analysis plan, report outline, and task matrix | 2 | Core evaluation team |
| Kick off the evaluation with a stakeholder meeting | 1 | Core evaluation team, funding agencies, national authorities, steering committee, broad stakeholder group |
| Gain access to data sets | 2 | Core evaluation team, broad stakeholder group |
| Execution phase | ||
| Conduct preliminary analyses | 4 | Core evaluation team, discussions with technical working groups |
| Complete remaining analyses | 4 | Core evaluation team |
| Develop complete draft report | 6 | Core evaluation team |
| Evaluation team and identified stakeholders review draft report | 3 | Core evaluation team, identified stakeholders |
| Convene consultative meeting to present the preliminary results | 1 | Broad stakeholder group, evaluation team, steering committee |
| Develop final draft report, incorporating feedback | 4 | Core evaluation team |
| Finalization phase | ||
| Broad review of the final draft, allow external reviewers to comment on report | 4 | Broad group of stakeholders and any additional external reviewers |
| Complete final edits, proofreading, and formatting | 3 | Evaluation team, editor, proofreader, graphic designer |
| Final approval (in-country and funding agency clearance) and make necessary revisions | 8 | Funding agencies, national authorities, core evaluation team |
| Hold dissemination event to share findings | 1 | Evaluation team, funding agencies, national authorities, all stakeholders |
NMCP = national malaria control program; PMI = President’s Malaria Initiative; TOR = terms of reference.
Modified from the Roll Back Malaria Monitoring and Evaluation Reference Group impact evaluation framework [2].
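As a quick arithmetic check (not part of the original analysis), the per-activity durations in the suggested timeline can be summed to confirm they are consistent with the roughly one-year time frame the authors recommend. The week values below are transcribed from the table above.

```python
# Durations (weeks) transcribed from the suggested timeline table.
initiation = [4, 6, 2, 1, 2]     # stakeholder discussions .. data access
execution = [4, 4, 6, 3, 1, 4]   # preliminary analyses .. final draft report
finalization = [4, 3, 8, 1]      # broad review .. dissemination event

total_weeks = sum(initiation) + sum(execution) + sum(finalization)
months = round(total_weeks / 52 * 12, 1)
print(total_weeks, "weeks ~=", months, "months")  # 53 weeks ~= 12.2 months
```

The 53-week total matches the statement in Figure 1 that these evaluations generally take about 12 months, with the eight-week clearance step accounting for much of the finalization phase.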
Sample malaria impact evaluation budget
| Activity | Cost (USD) |
|---|---|
| External (international) technical partner | 145,000–175,000 |
| • Analysis of household survey data | |
| • Analysis of additional data | |
| • Report writing | |
| • Management of the evaluation process | |
| Local technical partner | 70,000–140,000 |
| • Compilation of data in country | |
| • Writing background and intervention sections of the report | |
| • Data analysis | |
| • Management of the evaluation process | |
| Stakeholder meetings | 5,000–10,000 |
| Access and analysis of meteorological data | 0–50,000 |
| Translation services | 0–15,000 |
| Printing report (or key findings report) and dissemination meeting | 10,000–15,000 |
| Total | 230,000–405,000 |
This is a sample budget for evaluations conducted with local and external technical partners. Salaries of national malaria control program staff, funding agency staff, and other reviewers working on the evaluation are not included.
Depending on who is managing the evaluation and conducting the analyses, some costs may shift between the external and local technical partners.
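As a quick arithmetic check (figures transcribed from the table above, not part of the original analysis), summing the low and high ends of each line item reproduces the stated total of 230,000–405,000 USD.

```python
# Budget line items (low, high) in USD, transcribed from the sample budget table.
line_items = {
    "External (international) technical partner": (145_000, 175_000),
    "Local technical partner": (70_000, 140_000),
    "Stakeholder meetings": (5_000, 10_000),
    "Access and analysis of meteorological data": (0, 50_000),
    "Translation services": (0, 15_000),
    "Printing report and dissemination meeting": (10_000, 15_000),
}

low = sum(lo for lo, hi in line_items.values())
high = sum(hi for lo, hi in line_items.values())
print(f"{low:,}-{high:,}")  # 230,000-405,000
```

The wide range is driven mainly by the optional items (meteorological data access and translation), which can be zero when those services are not needed.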