Heather McKay, Patti-Jean Naylor, Erica Lau, Samantha M Gray, Luke Wolfenden, Andrew Milat, Adrian Bauman, Douglas Race, Lindsay Nettlefold, Joanie Sims-Gould.
Abstract
BACKGROUND: Interventions that work must be effectively delivered at scale to achieve population-level benefits. Researchers must choose among a vast array of implementation frameworks (> 60) that guide design and evaluation of implementation and scale-up processes. Therefore, we sought to recommend conceptual frameworks that can be used to design, inform, and evaluate implementation of physical activity (PA) and nutrition interventions at different stages of the program life cycle. We also sought to recommend a minimum data set of implementation outcome and determinant variables (indicators), as well as measures and tools deemed most relevant for PA and nutrition researchers.
Keywords: Dissemination; Exercise; Healthy eating; Implementation science; Public health; Scalability
Year: 2019 PMID: 31699095 PMCID: PMC6839114 DOI: 10.1186/s12966-019-0868-4
Source DB: PubMed Journal: Int J Behav Nutr Phys Act ISSN: 1479-5868 Impact factor: 6.457
Fig. 1 The focus of implementation evaluation along the scale-up continuum
Implementation and scale-up frameworks and process models that surfaced most often
Implementation
1. Framework for Effective Implementation
2. Consolidated Framework for Implementation Research (CFIR)
3. Dynamic Sustainability Framework

Scale-Up
1. Scaling Up Health Service Innovations - A Framework for Action
2.
3.

1. Steps to Developing a Scale-Up Strategy
2. Review of Scale-Up/Framework for Scaling Up Physical Activity Interventions
3. A Guide to Scaling Up Population Health Interventions

1. Diffusion of Innovations
2. Conceptual Model for the Spread and Sustainability of Innovations in Service Delivery and Organization
Note: Additional resources recommended by experts who participated in rounds 4 and 5 of the modified Delphi process are bolded
The 25 most highly ranked indicators reported by those who participated in Delphi Rounds 1–4
| Indicators | Ranking (Rounds 1–3) | Ranking (Round 4) |
|---|---|---|
| Acceptability | 1 | 4–11 |
| Adoption | 2 | 3 |
| Adaptability/adaptation | 3 | 25 |
| Barriers | 4 | 19–22 |
| Context | 5 | 4–11 |
| Implementation | 6 | 4–11 |
| Feasibility | 7 | 12–16 |
| Dose delivered (completeness) | 8 | 17–18 |
| Reach | 9 | 1 |
| Dose received (exposure) | 10 | 4–11 |
| Adherence | 11–15 | 12–16 |
| Appropriateness | 11–15 | 23–24 |
| Cost | 11–15 | 2 |
| Effectiveness | 11–15 | 4–11 |
| Fidelity | 11–15 | 4–11 |
| Culture | 16–20 | 12–16 |
| Dose (satisfaction) | 16–20 | 19–22 |
| Maintenance | 16–20 | 19–22 |
| Recruitment | 16–20 | 19–22 |
| Sustainability | 16–20 | 4–11 |
| Complexity | 21–25 | 23–24 |
| Dose | 21–25 | 12–16 |
| Efficacy (of interventions) | 21–25 | 12–16 |
| Innovation characteristics | 21–25 | 23–24 |
| Self-efficacy | 21–25 | 17–18 |
A minimum data set of implementation outcomes and determinants
| Implementation outcomes | Delivery of the intervention: definition | Delivery of implementation strategies: definition |
|---|---|---|
| 1. Adoption | Proportion and representativeness of providers or the delivery team* that deliver an intervention | Proportion and representativeness of the support system* that utilize implementation strategies. |
| 2. Dose delivered (dose) | Intended units of each intervention component delivered to participants by the delivery team | Intended units of each implementation strategy delivered by the support system. |
| 3. Reach | Proportion of the intended priority audience (i.e., participants) who participate in the intervention | Proportion of the intended priority populations (organizations and/or participants) that participate in the intervention. |
| 4. Fidelity (adherence) | The extent to which an intervention is implemented as prescribed in the intervention protocol by the delivery team | The extent to which implementation strategies are implemented as prescribed in the scale-up plan by the support system. |
| 5. Sustainability (maintenance) | Whether an intervention continues to be delivered and/or individual behaviour change is maintained; the intervention and individual behaviour change may evolve or adapt with continued benefits for individuals after a defined period of time | Whether implementation strategies continue to be delivered and/or behaviour change at the system level is maintained; implementation strategies and system-level behaviour change may evolve or adapt with continued benefits for systems after a defined period of time. |
| Implementation determinants | Delivery of the intervention: definition | Delivery of implementation strategies: definition |
|---|---|---|
| 1. Context | Aspects of the larger social, political, and economic environment that may influence intervention implementation | Aspects of the larger social, political, and economic environment that may influence delivery of the implementation strategies. |
| 2. Acceptability | Perceptions among the delivery team that a given intervention is agreeable, palatable, or satisfactory | Perceptions among the support system that implementation strategies are agreeable, palatable, or satisfactory. |
| 3. Adaptability | Extent to which an intervention can be adapted, tailored, refined, or reinvented to meet local needs | Extent to which implementation strategies can be adapted, tailored, refined, or reinvented to meet the needs of organizations at scale-up. |
| 4. Feasibility | Perceptions among the delivery team that an intervention can be successfully used or carried out within a given organization or setting | Perceptions among the support system that implementation strategies can be successfully used or carried out at scale within different organizations or settings. |
| 5. Compatibility (appropriateness) | Extent to which an intervention fits with the mission, priorities, and values of organizations or settings | Extent to which implementation strategies fit with the mission, priorities, and values of organizations at scale-up. |
| 6. Cost | Money spent on design, adaptation, and implementation of an intervention | Money spent on design, adaptation, and delivery of implementation strategies. |
| 7. Culture | Organizations’ norms, values, and basic assumptions around selected health outcomes | Organizations’ norms, values, and basic assumptions around selected implementation strategies. |
| 8. Dose (satisfaction) | Delivery team’s satisfaction with an intervention and with interactions with the support system | Support system’s satisfaction with implementation strategies. |
| 9. Complexity | Perceptions among the delivery team that a given intervention is relatively difficult to understand and use; number of different intervention components | Perceptions among the support system that implementation strategies are relatively difficult to understand and use; number of different strategies. Related to implementation setting. |
| 10. Self-efficacy | Delivery team’s belief in its ability to execute courses of action to achieve implementation goals | Support system’s belief in its ability to execute courses of action to achieve implementation goals. |
Note: Indicators are defined according to whether they assess delivery of the intervention to participants (by delivery partners) or delivery of implementation strategies at the organizational level (by those who comprise a support system). Where similar terms were collapsed, the term preferred by the expert group is numbered and the synonymous term is bracketed. Several indicators were grouped because they had similar or shared definitions (dose delivered/dose; compatibility/appropriateness; sustainability/maintenance; dose/satisfaction). Four indicators were excluded from the tables based on the opinion of the expert group that participated in rounds 4 and 5: implementation, recruitment, efficacy (of interventions), and effectiveness (of interventions).
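For evaluation teams who want to operationalize this minimum data set in software (e.g., a project database or analysis script), the two-level structure above can be encoded directly. The sketch below is purely illustrative: the `Indicator` class, its field names, and the two-level split are assumptions of this example, not part of the paper; only the indicator names, their outcome/determinant grouping, and the bracketed synonyms come from the tables.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass(frozen=True)
class Indicator:
    """One entry of the minimum data set (illustrative schema only)."""
    name: str                # preferred term from the expert group
    kind: str                # "outcome" or "determinant"
    synonym: Optional[str]   # bracketed synonymous term, if any

# The five implementation outcomes and ten determinants from the tables.
# Each is assessed at two levels: delivery of the intervention (by the
# delivery team) and delivery of implementation strategies (by the
# support system).
MINIMUM_DATA_SET = [
    Indicator("Adoption", "outcome", None),
    Indicator("Dose delivered", "outcome", "dose"),
    Indicator("Reach", "outcome", None),
    Indicator("Fidelity", "outcome", "adherence"),
    Indicator("Sustainability", "outcome", "maintenance"),
    Indicator("Context", "determinant", None),
    Indicator("Acceptability", "determinant", None),
    Indicator("Adaptability", "determinant", None),
    Indicator("Feasibility", "determinant", None),
    Indicator("Compatibility", "determinant", "appropriateness"),
    Indicator("Cost", "determinant", None),
    Indicator("Culture", "determinant", None),
    Indicator("Dose", "determinant", "satisfaction"),
    Indicator("Complexity", "determinant", None),
    Indicator("Self-efficacy", "determinant", None),
]

outcomes = [i.name for i in MINIMUM_DATA_SET if i.kind == "outcome"]
determinants = [i.name for i in MINIMUM_DATA_SET if i.kind == "determinant"]
print(len(outcomes), len(determinants))  # prints: 5 10
```

A structure like this makes it straightforward to attach a measure, data source, and scale-up stage to each indicator at both delivery levels; those extra fields would be project-specific choices, not prescribed by the paper.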