Rachel Flynn, Bonnie Stevens, Arjun Bains, Megan Kennedy, Shannon D Scott.
Abstract
BACKGROUND: There is limited evidence to evaluate the sustainability of evidence-based interventions (EBIs) for healthcare improvement. Through an integrative review, we aimed to identify approaches to evaluating the sustainability of EBIs and the sustainability outcomes reported.
Year: 2022 PMID: 36243760 PMCID: PMC9569065 DOI: 10.1186/s13643-022-02093-1
Source DB: PubMed Journal: Syst Rev ISSN: 2046-4053
Inclusion and exclusion criteria
Articles were included if they:
• Were published in a peer-reviewed journal
• Were primary research
• Had explicit research design and data collection methods
• Had an explicit theoretical approach^a for sustainability (theory^b, model^c, framework^d, instrument^e, method, checklist, process, strategy, conceptualization, development, pilot test, and/or tool^f)
• Were within a healthcare setting^g
• Evaluated the sustainability of an EBI (note that terminology for EBIs is inconsistent, and they may be referred to as QI interventions/initiatives in the literature)

Articles were excluded if they:
• Were secondary research
• Were gray literature
• Were outside a healthcare context (e.g., schools, social care settings)
• Did not evaluate the sustainability of an EBI or QI intervention using a defined approach
• Did not evaluate sustainability using a clear research design and method
• Did not have an independent evaluation component on sustainability
• Focused only on the sustainability of clinical outcomes without a tangible approach
• Focused only on defining or constructing concepts of sustainability
^a Defined as a process for describing and/or guiding the process of translating research into practice (process models); understanding and/or explaining what influences implementation outcomes (determinant frameworks, classic theories, implementation theories); and evaluating implementation (evaluation frameworks) [14]
^b Defined as a theoretical approach in implementation science with some predictive capacity (e.g., to what extent … ?) that attempts to provide an enhanced understanding and explanation of certain aspects of implementation [14]
^c Defined as a theoretical approach in implementation science commonly used to describe and/or guide the specific, step-by-step process of translating research into practice [14]
^d Defined as a theoretical approach in implementation science with a descriptive purpose [14]; points out factors believed or found to influence implementation outcomes but does not specify the mechanisms of change [14]
^e Facilitates evaluation and usually evolves as an extension of frameworks or models, or to operationalize theories [14]
^f Assists individuals in how to retrieve, comprehend, and implement research evidence [14]
^g Defined as a broad array of services and places where healthcare provision occurs, including acute care hospitals, urgent care centers, rehabilitation centers, nursing homes and other long-term care facilities, specialized outpatient services (e.g., hemodialysis, dentistry, podiatry, chemotherapy, endoscopy, and pain management clinics), and outpatient surgery centers
Categories of theoretical approaches used in implementation science
| Theoretical approach | Definition |
|---|---|
| Process models | Specify steps in the process of translating research into practice |
| Determinant frameworks | Classes or domains of determinants that are hypothesized or have been found to influence implementation outcomes |
| Classic theories | Describe how change occurs, without ambitions to carry out the change |
| Implementation theories | Developed or adapted by researchers for potential use in implementation science, to achieve enhanced understanding and explanation of certain aspects of implementation |
| Evaluation frameworks | Provide a structure for evaluating implementation endeavors |
Sustainability outcomes identified in included studies [6]
1. Benefits for patients, staff, and stakeholders continue.
2. Initiative activities or components of the intervention continue.
3. Maintenance of relationships, partnerships, or networks.
4. Maintenance of new procedures and policies.
5. Attention to and awareness of the problem or issue is continued or increased.
6. Replication, roll-out, or scale-up of the initiative.
7. Capacity built within staff, stakeholders, and communities continues.
8. Adaptation in response to new evidence or contextual influences.
9. Gaining further funds to continue the initiative and maintain improvements.
Fig. 1 PRISMA 2020 flow diagram of search results
Data collection method of included studies
| Data collection method | Referenced studies |
|---|---|
| | 19–49 |
| Interviews | 19–22, 24–40, 42–49 |
| Onsite inspection + tool | 41 |
| Steering committee minutes | 23 |
| Workshops + field notes | 49 |
| | 50–65 |
| Interviews + survey/questionnaire | 50–52, 55, 58–62, 64, 65 |
| | 66–74 |
| Interviews + survey/questionnaire | 69, 73 |
| Interviews + other | 67–70, 72, 73 |
| | 75–82 |
| Survey/questionnaire | 75–82 |
Sustainability definition sources
| Sustainability definition source | Number of studies^a |
|---|---|
| Shediac-Rizkallah and Bone (1998) | 5 |
| National Health Services Sustainability Model (2010) | 5 |
| Stirman et al. (2012) | 5 |
| Scheirer and Dearing (2011) | 5 |
| May et al. (2009) | 4 |
| Scheirer (2005) | 4 |
| Pluye et al. (2004) | 3 |
| Schell, Luke, and Schooley (2013) | 2 |
| Proctor et al. (2011) | 2 |
| Gruen et al. (2008) | 2 |
| Scheirer (2013) | 2 |
| Fleiszer et al. (2015) | 2 |
| Chambers et al. (2013) | 1 |
| Honadle and Sante (1985) | 1 |
| Organization for Economic Co-operation and Development | 1 |
| Appleby et al. (2005) | 1 |
| World Health Organization (2002) | 1 |
| Own definition | 3 |

^a Total number is greater than 39, as some studies used multiple sources to define sustainability
Primary theoretical approaches used to evaluate sustainability
| Theoretical approach | Referenced studies |
|---|---|
| Single | |
| Multiple | |
| Own | |
| Single | |
| Single + theory | |
| Single + tool | |
| Multiple + theory | |
| Own | |
Primary approach for evaluation of sustainability
Framework:
• Exploration, Preparation, Implementation, Sustainment (EPIS) framework
• Consolidated Framework for Implementation Research
• Promoting Action on Research Implementation in Health Services (PARiHS)
• Framework for investigating the sustainability of antiretroviral (ARV) provision
• Reach, Effectiveness, Adoption, Implementation, and Maintenance (RE-AIM) framework
• The health technology adoption, Non-adoption, Abandonment, and challenges to Scale-up framework
• Dynamic Sustainability Framework (DSF)
• Own framework
• Direction, Competency, Opportunity and Motivation (DCOM®) framework
• Framework for Sustainability of Translational Research Project
• Conceptual framework on sustainability of community-based programs
• Scheirer's framework for program sustainability
• Conceptual framework by Pomey et al. (2009)
• Program Sustainability Framework
• Combination of several theoretical frameworks

Model:
• National Health Service Sustainability Model (NHS SM)
• Developed own model
• The British National Health Service Sustainability Index (SI) model
• Stages of Change Model
• Sustainability Pyramid Model
• Mancini and Marek's Model of Community-Based Program Sustainability
• Kotter's 8 Steps Process for Leading Change Model
• Dynamic model of health program sustainability
• The Evidence in the Learning Organization (ELO) Model

Tool:
• Program Sustainability Assessment Tool (PSAT)
• Sustainability Index and Dashboard Tool
• NHS Institute for Innovation and Improvement SM Self-Assessment Tool

Theory:
• Diffusion of Innovation theory
• Normalization Process Theory (NPT)

Instrument:
• Technology Adaption Readiness Scale (TARS)
• Adapted Level of Institutionalization (LoIn) scales
• Individual Placement and Support Fidelity Scale (IPS-25)
Study design and MMAT score of 64 included studies
| Study design | Number of studies (%) | MMAT 0 | MMAT 20 | MMAT 40 | MMAT 60 | MMAT 80 | MMAT 100 |
|---|---|---|---|---|---|---|---|
| Mixed-methods | 26.5 | 0 | 2 | 2 | 4 | 24 | |
| Multi-methods | 12.5 | 0 | 0 | 1 | 1 | 5 | |
| Qualitative | 50 | 0 | 1 | 4 | 6 | 6 | |
| Quantitative | 11 | 0 | 2 | 2 | 1 | 3 | |
Reported timing of evaluation
| Timing of evaluation | Number of studies | Points of data collection | Reference(s) |
|---|---|---|---|
| Pre- and post-intervention | | Baseline^a, 1 year post, 2 years post | |
| | | Baseline, 6–8 months post, 36–42 months post | |
| Pre- and during intervention | | Baseline, 3 years post | |
| Pre-, during, and post-intervention | | Baseline, 3–6 months post, 7–12 months post | |
| | | 0, 1 year, 2–4 years | |
| During intervention | | Over 1 year: followed implementation as it occurred | |
| During and post-intervention | | 1 year post-training, 2 years post-training, 4 years post-training | |
| Post-intervention | | Under 2 years post-intervention | |
| | | 2 years post-intervention | [22, 34, 45, 50, 67, 82] |
| | | 3 years post-intervention | [20, 35, 48, 56, 74, 76] |
| | | 4 years post-intervention | |
| | | 5 years post-intervention | |
| | | 6 years post-intervention | |
| | | 10+ years post-intervention | |
| Unclear | | Unclear | |
^a Describes a measurement taken at a time when no intervention is taking place; it can be used as a benchmark against which to compare outcomes once the intervention begins
Fig. 2 Sustainability outcomes measured