Hilary Pinnock, Melanie Barwick, Christopher R Carpenter, Sandra Eldridge, Gonzalo Grandes, Chris J Griffiths, Jo Rycroft-Malone, Paul Meissner, Elizabeth Murray, Anita Patel, Aziz Sheikh, Stephanie J C Taylor.
Abstract
Implementation studies are often poorly reported and indexed, reducing their potential to inform initiatives to improve healthcare services. The Standards for Reporting Implementation Studies (StaRI) initiative aimed to develop guidelines for transparent and accurate reporting of implementation studies. Informed by the findings of a systematic review and a consensus-building e-Delphi exercise, an international working group of implementation science experts discussed and agreed the StaRI Checklist comprising 27 items. It prompts researchers to describe both the implementation strategy (techniques used to promote implementation of an underused evidence-based intervention) and the effectiveness of the intervention that was being implemented. An accompanying Explanation and Elaboration document (published in BMJ Open, doi:10.1136/bmjopen-2016-013318) details each of the items, explains the rationale, and provides examples of good reporting practice. Adoption of StaRI will improve the reporting of implementation studies, potentially facilitating translation of research into practice and improving the health of individuals and populations. Published by the BMJ Publishing Group Limited.
Year: 2017 PMID: 28264797 PMCID: PMC5421438 DOI: 10.1136/bmj.i6795
Source DB: PubMed Journal: BMJ ISSN: 0959-8138

Fig 1 Positioning of implementation studies and the focus of StaRI reporting standards (adapted from fig 12.1 in Brownson et al)
Standards for Reporting Implementation Studies: the StaRI Checklist of items to be reported
| Checklist item | Implementation strategy* | Intervention† |
|---|---|---|
| **Title and abstract** | | |
| 1. Title | Identification as an implementation study, and description of the methodology in the title and/or keywords | |
| 2. Abstract | Identification as an implementation study, including a description of the implementation strategy to be tested, the evidence-based intervention being implemented, and defining the key implementation and health outcomes | |
| **Introduction** | | |
| 3. Problem | Description of the problem, challenge, or deficiency in healthcare or public health that the intervention being implemented aims to address | |
| 4. Rationale | The scientific background and rationale for the implementation strategy (including any underpinning theory, framework, or model, how it is expected to achieve its effects, and any pilot work) | The scientific background and rationale for the intervention being implemented (including evidence about its effectiveness and how it is expected to achieve its effects) |
| 5. Aims | The aims of the study, differentiating between implementation objectives and any intervention objectives | |
| **Methods: description** | | |
| 6. Design | The design and key features of the evaluation (cross referencing to any appropriate methodology reporting standards) and any changes to study protocol, with reasons | |
| 7. Context | The context in which the intervention was implemented (consider social, economic, policy, healthcare, organisational barriers and facilitators that might influence implementation elsewhere) | |
| 8. Targeted "sites" | The characteristics of the targeted "site(s)" (locations, personnel, resources, etc) for implementation and any eligibility criteria | The population targeted by the intervention and any eligibility criteria |
| 9. Description | A description of the implementation strategy | A description of the intervention |
| 10. Subgroups | Any subgroups recruited for additional research tasks, and/or nested studies are described | |
| **Methods: evaluation** | | |
| 11. Outcomes | Defined pre-specified primary and other outcome(s) of the implementation strategy, and how they were assessed. Document any pre-determined targets | Defined pre-specified primary and other outcome(s) of the intervention (if assessed), and how they were assessed. Document any pre-determined targets |
| 12. Process evaluation | Process evaluation objectives and outcomes related to the mechanism(s) through which the strategy is expected to work | |
| 13. Economic evaluation | Methods for resource use, costs, economic outcomes, and analysis for the implementation strategy | Methods for resource use, costs, economic outcomes, and analysis for the intervention |
| 14. Sample size | Rationale for sample sizes (including sample size calculations, budgetary constraints, practical considerations, data saturation, as appropriate) | |
| 15. Analysis | Methods of analysis (with reasons for that choice) | |
| 16. Subgroup analyses | Any a priori subgroup analyses (such as between different sites in a multicentre study, different clinical or demographic populations) and subgroups recruited to specific nested research tasks | |
| **Results** | | |
| 17. Characteristics | Proportion recruited and characteristics of the recipient population for the implementation strategy | Proportion recruited and characteristics (if appropriate) of the recipient population for the intervention |
| 18. Outcomes | Primary and other outcome(s) of the implementation strategy | Primary and other outcome(s) of the intervention (if assessed) |
| 19. Process outcomes | Process data related to the implementation strategy mapped to the mechanism by which the strategy is expected to work | |
| 20. Economic evaluation | Resource use, costs, economic outcomes, and analysis for the implementation strategy | Resource use, costs, economic outcomes, and analysis for the intervention |
| 21. Subgroups | Representativeness and outcomes of subgroups including those recruited to specific research tasks | |
| 22. Fidelity, adaptation | Fidelity to implementation strategy as planned and adaptation to suit context and preferences | Fidelity to delivering the core components of intervention (where measured) |
| 23. Contextual changes | Contextual changes (if any) which may have affected outcomes | |
| 24. Harms | All important harms or unintended effects in each group | |
| **Discussion** | | |
| 25. Structured discussion | Summary of findings, strengths and limitations, comparisons with other studies, conclusions and implications | |
| 26. Implications | Discussion of policy, practice and/or research implications of the implementation strategy (specifically including scalability) | Discussion of policy, practice and/or research implications of the intervention (specifically including sustainability) |
| **General** | | |
| 27. Approvals and registration | Include statement(s) on regulatory approvals (including, as appropriate, ethical approval, confidential use of routine data, governance approval), trial or study registration (availability of protocol), funding, and conflicts of interest | |

Items with an empty Intervention column apply to the study as a whole (both strands).
*Implementation strategy refers to how the intervention was implemented.
†Intervention refers to the healthcare or public health intervention that is being implemented.
Note: A key concept is the dual strands of describing (a) the implementation strategy and (b) the clinical, healthcare, or public health intervention that is being implemented. These strands are represented as two columns in the checklist. The primary focus of implementation science is the implementation strategy (column 1) and the expectation is that this will always be completed. The evidence about the impact of the intervention on the targeted population should always be considered (column 2) and either health outcomes reported or robust evidence cited to support a known beneficial effect of the intervention on the health of individuals or populations. While all items are worthy of consideration, not all items will be applicable to or feasible within every study.
Terminology: definitions and illustration
| Terminology | Definition | Illustration using a study implementing supported self management for asthma |
|---|---|---|
| Implementation science | The scientific study of methods to promote the systematic uptake of evidence-based interventions into practice and policy and hence improve health | Improving implementation in routine practice of evidence-based supported self management for asthma |
| Implementation strategy | Methods or techniques used to enhance the adoption, implementation, and sustainability of an under-utilised intervention | A programme of professional training, templates for reviews, access to resources, facilitation, audit, and feedback |
| Intervention | The evidence-based practice, programme, policy, process, or guideline recommendation that is being implemented | Provision of asthma self management in routine asthma reviews, including completion of action plans |
| Implementation outcome | Process or quality measure to assess the impact of the implementation strategy (w24) | Proportion of people with asthma who have an action plan |
| Health outcome | Patient-level health outcomes for a clinical intervention, such as symptoms or mortality; or population-level health status or indices of system function for a system/organisational-level intervention | Proportion of people with asthma requiring unscheduled care for asthma or patient reported asthma control |
| Logic pathway | The way(s) in which the implementation strategy and intervention are hypothesised to operate | An organisation that prioritises self management encourages or enables trained professionals to provide asthma action plans; self management improves asthma outcomes |
| Fidelity | The degree of adherence to the described implementation strategy and/or the degree to which an intervention is implemented as prescribed in the original protocol (w29) | Uptake of professional training, utilisation of review templates (implementation fidelity), and assessment of adequacy of education and completion of action plans (intervention fidelity) |
| Adaptation | The degree to which the strategy and intervention are modified by users during implementation to suit local needs | Use (or not) of telehealth to deliver reviews or provide action plans. Different professionals (doctors, nurses, pharmacists) with primary responsibility for self management education |