| Literature DB >> 26769997 |
Simon Hales, Ana Lesher-Trevino, Nathan Ford, Dermot Maher, Andrew Ramsay, Nhan Tran.
Abstract
In public health, implementation research is done to improve access to interventions that have been shown to work but have not reached many of the people who could benefit from them. Researchers identify practical problems facing public health programmes and aim to find solutions that improve health outcomes. In operational research, routinely collected programme data are used to uncover ways of delivering more effective, efficient and equitable health care. As implementation research can address many types of questions, many research designs may be appropriate. Existing reporting guidelines only partially cover the methods used in implementation and operational research, so we ran a consultation through the World Health Organization (WHO), the Alliance for Health Policy and Systems Research (AHPSR) and the Special Programme for Research and Training in Tropical Diseases (TDR) and developed guidelines to facilitate the funding, conduct, review and publishing of such studies. Our intention is to provide a practical reference for funders, researchers, policymakers, implementers, reviewers and editors working with implementation and operational research. This is an evolving field, so we plan to monitor the use of these guidelines and develop future versions as required.
Year: 2015 PMID: 26769997 PMCID: PMC4709804 DOI: 10.2471/BLT.15.167585
Source DB: PubMed Journal: Bull World Health Organ ISSN: 0042-9686 Impact factor: 9.408
Research objectives, implementation questions and research methods
| Objective | Description | Implementation question | Research methods |
|---|---|---|---|
| Explore | Explore an idea or phenomenon to make hypotheses or generalizations from specific examples | What are the possible factors and agents responsible for good implementation of a health intervention? For enhancing or expanding a health intervention? | Qualitative methods: grounded theory, ethnography, phenomenology, case studies and narrative approaches; key informant interviews, focus groups, historical reviews. Quantitative: network analysis, cross-sectional surveys. Mixed methods: combining qualitative and quantitative methods |
| Describe | Identify and describe the phenomenon and its correlates or possible causes | What describes the context in which implementation occurs? What describes the main factors influencing implementation in a given context? | Quantitative: cross-sectional (descriptive) surveys, network analysis. Qualitative methods: grounded theory, ethnography, phenomenology, case studies and narrative approaches; key informant interviews, focus groups, historical reviews. Mixed methods: both qualitative and quantitative inquiry with convergence of data and analyses |
| Influence | Test whether an intervention produces an expected outcome | | |
| With adequacy | With sufficient confidence that the intervention and outcomes are occurring | Is coverage of a health intervention changing among beneficiaries of the intervention? | Before-after or time series in intervention recipients only; participatory action research |
| With plausibility | With greater confidence that the outcome is due to the intervention | Is a health outcome plausibly due to the implemented intervention rather than other causes? | Concurrent, non-randomized cluster trials: health intervention implemented in some areas and not in others; before-after or cross-sectional study in programme recipients and non-recipients; typical quality improvement studies |
| With probability | With a high (calculated) probability that the outcome is due to the intervention | Is a health outcome due to implementation of the intervention? | Partially controlled trials: pragmatic and cluster randomized trials; health intervention implemented in some areas and not in others; effectiveness-implementation hybrids |
| Explain | Develop or expand a theory to explain the relation between concepts, the reasons for the occurrence of events, and how they occurred | How and why does implementation of the intervention lead to effects on health behaviour, services, or status in all its variations? | Mixed methods: both qualitative and quantitative inquiry with convergence of data and analyses. Quantitative: repeated measures of context, actors, depth and breadth of implementation across subunits; network identification; can use designs for confirmatory inferences; effectiveness-implementation hybrids. Qualitative methods: case studies, phenomenological and ethnographic approaches with key informant interviews, focus groups, historical reviews. Participatory action research |
| Predict | Use prior knowledge or theories to forecast future events | What is the likely course of future implementation? | Quantitative: agent based modelling; simulation and forecasting modelling; data extrapolation and sensitivity analysis (trend analysis, econometric modelling). Qualitative: scenario building exercises; Delphi techniques from opinion leaders |
Note: Table reproduced from Peters et al.
Reporting guidelines for operational/implementation research^a
| Section | Reporting item |
|---|---|
| Title and abstract | Identify as implementation or operational research in the title. Provide a structured summary of study context, rationale, objectives, design, methods, results and conclusions. |
| Background | Explain the scientific background relating to both the intervention and the implementation. What is already known about the issue? |
| Problem | Briefly describe the nature and severity of the specific issue or problem that was addressed. |
| Implementation strategy | Describe mechanisms or strategies by which components were expected to cause changes, and plans for testing whether these were effective. |
| Intervention | What evidence-based intervention or innovation is proposed? |
| Intended outcomes | Describe the specific aim of the proposed study (changes/improvements in processes and outcomes). |
| Study design | Identify the study design (for example, observational, quasi-experimental, experimental, qualitative, mixed) chosen for measuring the impact of the intervention on primary and secondary outcomes (if relevant). |
| Setting | Exact details of study locations, baseline population characteristics, recruitment of participants, relevant dates for implementation, follow-up, and data collection. |
| Implementation | Give a description of the implementation strategy: frequency, duration and intensity; how and when interventions were actually implemented; additional resources required to support implementation; mode of delivery; and why and when the study ended. |
| | Describe the intervention (if relevant). The amount of detail given should be sufficient to allow replication of the study. For well-established interventions, it is sufficient to refer to previously published studies. |
| | Explain methods used to assure data quality (for example, blinding; repeating measurements and data extraction; training in data collection; collection of sufficient baseline measurements). |
| Participants | For qualitative studies: what was the approach (e.g. ethnography, grounded theory, narrative) and theory? |
| | For matched studies, give matching criteria and number of exposed and unexposed or the number of controls per case. |
| Variables | Clearly define all outcomes, exposures, predictors, potential confounders, and effect modifiers. |
| Data sources/measurement | For each variable of interest, give sources of data and methods of assessment (or measurement). Describe sampling strategies and comparability of assessment methods if there is more than one group. |
| Analyses | Which analyses were pre-specified, and which were exploratory? |
| Ethical considerations | Describe ethical considerations, including consent procedures, if relevant. How was confidentiality ensured? |
| Descriptive data | Report numbers of individuals at each stage of the study – e.g. numbers eligible, included in the study, completing follow-up, and analysed. Include a flow diagram, timeline or graph, if relevant. |
| Outcomes | Explain the actual course of the intervention, if relevant. For example, describe the sequence of steps, events or phases; type and number of participants at key points, preferably using a time-line diagram or flowchart. |
| | Document the degree of success in implementation: |
| Outcome data | Report numbers of outcome events (or summary measures over time), separately for those who receive the intervention and those who do not receive it. Include summary statistics and measure of variance (SD or SE). |
| Main results | Main findings (e.g. interpretations, inferences, and themes); might include development of a theory or model, or integration with prior research or theory. |
| Other analyses | Report other analyses done – e.g. analyses of subgroups and interactions, sensitivity analyses, costs. |
| Key results | Summarize key results with reference to study objectives. |
| Limitations | Discuss limitations of the study, taking into account possible sources of confounding, bias or imprecision in design, measurement, and analysis that might have affected study outcomes (internal validity). |
| Interpretation | Interpret the results considering objectives, limitations, multiplicity of analyses, results from similar studies, and other relevant evidence. |
| Contextual factors | Describe success factors and barriers, and how barriers were overcome. |
| Generalizability | Discuss the generalizability (external validity) of the study results. |
| | Consider the overall practical usefulness of the intervention. |
| Registration | Indicate if the study is registered and if the data are available. |
| Funding | Give the source of funding and the role of the funders for the present study and, if applicable, for the original study on which the present article is based. |
SD: standard deviation; SE: standard error.
^a The reporting items in this table are intended to cover the wide range of study designs for implementation and operational research. As a result, not all of the items are relevant for all studies (for example, some studies will not involve the testing of an implementation strategy).
Fig. 1. Flow chart illustrating the process of guideline development
Reporting guidelines^a
| Type of study | Guideline name | Example extensions |
|---|---|---|
| Randomized trials | CONSORT | TIDieR |
| Observational studies | STROBE | RECORD |
| Systematic reviews | PRISMA | PRISMA-P |
| Qualitative research | SRQR | COREQ |
| Diagnostic/prognostic studies | STARD | TRIPOD |
| Quality improvement studies | SQUIRE | |
| Economic evaluations | CHEERS | |
| Phase IV implementation studies | STaRI | |
| Policy interventions | UNTIDieR | |
^a Adapted from http://www.equator-network.org