Anna Kågesten1, Özge Tunçalp2, Moazzam Ali2, Venkatraman Chandra-Mouli2, Nhan Tran3, A Metin Gülmezoglu2.
Abstract
BACKGROUND: Complete and accurate reporting of programme preparation, implementation and evaluation processes in the field of sexual and reproductive health (SRH) is essential to understand the impact of SRH programmes, as well as to guide their replication and scale-up.
Year: 2015 PMID: 26418859 PMCID: PMC4852887 DOI: 10.1371/journal.pone.0138647
Source DB: PubMed Journal: PLoS One ISSN: 1932-6203 Impact factor: 3.240
Fig 1. PRISMA 2009 flowchart of the screening and data extraction process.
The majority of articles (96%) were published in peer-reviewed journals, the most common being BMC Medical Education or BMC Medical Research Methodology (8%), PLoS Medicine (7%) and Journal of Clinical Epidemiology (7%). The included articles corresponded to 45 tools (Table 1) retained for synthesis.
Table 1. Overview of included tools, by relevance to the current systematic review.

| Tool | Target study design | Focus | No. of items |
|---|---|---|---|
| 1. Complexity spectrum checklist | Randomized controlled trials | Complex intervention trials | 14 |
| 2. CONSORT statement–proposed addition | Randomized controlled trials | Implementation reporting | N/A |
| 3. CONSORT–SPI statement (on-going development) | Randomized controlled trials | Social and psychological intervention trials | N/A |
| 4. CONSORT statement–unofficial extension | Randomized controlled trials | Behavioural medicine intervention trials | 22 |
| 5. Reporting on development and evaluation of complex interventions in healthcare (CReDECI) guideline | Intervention research, all study designs | Development and evaluation of complex interventions in healthcare | 16 |
| 6. Guidelines for reporting evidence-based practice educational interventions (GREET) statement (on-going development) | Intervention research, all study designs | Description of educational evidence-based practice strategies | N/A |
| 7. Implementation research framework for health sciences | Implementation research | Framework for implementation research in health | N/A |
| 8. Oxford Implementation Index | Systematic reviews | Implementation data in systematic reviews | 17 |
| 9. Program evaluation and monitoring system (PEMS) | Program evaluation and monitoring | HIV prevention | 8 |
| 10. PROGRESS-Plus checklist | Intervention research, all study designs | Equity lens for reporting on interventions | N/A |
| 11. Reporting of HIV interventions | Intervention research, all study designs | Quality of study methods in HIV prevention interventions | 11 |
| 12. Reporting of implementation for injury prevention initiatives | Systematic reviews | Injury prevention implementation | N/A |
| 13. Reporting of nursing interventions | Intervention research, all study designs | Content of complex nursing interventions | 20 |
| 14. Reporting of public health interventions | Intervention research, all study designs | Public health interventions | N/A |
| 15. Reporting of tailored interventions | Intervention research, all study designs | Tailored interventions | 7 |
| 16. Structured assessment of feasibility (SAFE) checklist | Intervention research, all study designs | Feasibility of complex interventions in mental health services | 16 |
| 17. Standards for quality improvement reporting excellence (SQUIRE) guidelines | Intervention research, all study designs | Quality improvement interventions | 19 |
| 18. Integrated checklist for determinants of practice (TICD) | Determinants of practice | Health care and chronic disease | 53 |
| 19. Template for intervention description and replication (TIDieR) statement | Intervention research, all study designs | Description of interventions | 12 |
| 20. Transparent Reporting of Evaluations with Nonrandomized Designs (TREND) statement | Intervention research, non-randomized designs | Evaluation of public health/behavioural interventions | 22 |
| 21. Workgroup for intervention development and evaluation research (WIDER) statement | Intervention research, all study designs | Components of behaviour change interventions | 24 |

| Tool | Target study design | Focus | No. of items |
|---|---|---|---|
| 22. CONSORT statement | Randomized controlled trials | General | 25 |
| 23. CONSORT statement–extension | Randomized controlled trials | Non-pharmacological treatments | 22 |
| 24. CONSORT–EHEALTH statement | Intervention research, all study designs | Evaluations of web-based and mobile health interventions | 25 |
| 25. CONSORT statement–extension | Randomized controlled trials | Pragmatic trials | 22 |
| 26. Consolidated criteria for reporting qualitative research (COREQ) statement | Qualitative studies | Interviews and focus groups | 32 |
| 27. Guidance for Reporting Involvement of Patients and Public (GRIPP) checklist | Patients and public involvement in research | Health technology/health services | 10 |
| 28. Preferred reporting standards in systematic reviews and meta-analyses (PRISMA) statement | Systematic reviews | General | 27 |
| 29. PRISMA–Equity statement | Systematic reviews | Health equity | 27 |
| 30. Reporting of internet interventions | Intervention research, all study designs | Internet | 12 |
| 31. Reporting of public health programs in Colorado | Program reporting | Public health program reporting system | N/A |
| 32. Statement on reporting of evaluation studies in health informatics (STARE–HI) | Evaluation studies | Evaluation of health informatics systems | 14 |

| Tool | Target study design | Focus | No. of items |
|---|---|---|---|
| 33. Checklist for systematic reviews of non-randomized studies | Systematic reviews, non-randomized designs | Non-randomized studies of health care interventions | 4 |
| 34. Consolidated Health Economic Evaluation Reporting Standards (CHEERS) statement | Intervention research, economic evaluations | Economic evaluations of health care interventions | 24 |
| 35. Checklist to evaluate report of non-pharmacological trials (CLEAR NPT) | Randomized controlled trials | Non-pharmacological treatments | 10 |
| 36. Enhancing transparency in reporting the synthesis of qualitative research (ENTREQ) | Systematic reviews | Qualitative research synthesis | 21 |
| 37. International Society for Pharmacoeconomics and Outcomes Research checklist for cost-effectiveness analysis alongside clinical trials (ISPOR RCT-CEA) | Randomized controlled trials | Cost-effectiveness alongside clinical trials | 27 |
| 38. Reporting guidelines for observational longitudinal studies | Observational studies | Longitudinal health and medical research | 33 |
| 39. Reporting guidelines for survey research | Survey research | General | 38 |
| 40. Reporting qualitative research in health informatics (REQ-HI) recommendations | Qualitative research | Health informatics | 14 |
| 41. Rural and Remote Health Journal guideline | All study designs | Rural and remote health | 15 |
| 42. Standards for reporting on diagnostic accuracy studies (STARD) statement | Diagnostic accuracy studies | General | 25 |
| 43. Strengthening reporting of genetic associations (STREGA) statement | Observational studies | Genetic association studies | 22 |
| 44. STandards for Reporting Interventions in Clinical Trials of Acupuncture (STRICTA) statement | Intervention research, all study designs | Acupuncture | 6 |
| 45. STrengthening the Reporting of OBservational studies in Epidemiology (STROBE) statement | Observational studies | General | 22 |
Note: N/A means that the tool did not present an official list of items, but included a narrative description of important reporting elements.
Table 2. Reporting items related to programme preparation, implementation and evaluation.
| Domain | Sub-domain | Item | Description |
|---|---|---|---|
| Programme preparation | Objective/Focus | 1. Programme name | Name of programme |
| | | 2. Objectives and anticipated impact of programme (why) | Anticipated short-term and long-term influences of the programme on individual participants as well as wider implications |
| | | 3. Target population | Characteristics of the target population planned to be reached, and at what level (individual, group, wider population) |
| | Design | 4. Organization/agency | Name, credentials and affiliations of the organization(s) developing the programme |
| | | 5. Funding source | Name of programme donor/funding source(s) |
| | | 6. Programme design process | Description of the process of designing the programme |
| | | 7. Theoretical foundation | Underlying theory and/or logic model of the programme |
| | | 8. Programme manual | Whether a manual or protocol existed for the programme |
| | | 9. Implementation strategy | Details on whether an implementation strategy was developed |
| | | 10. Evaluation plans | Any evaluation plans, both to assess programme implementation/process and to evaluate the programme's impact/results |
| | Piloting | 11. Piloting of activities | Whether programme activities were piloted, and if so how, when, by whom and with what results |
| Programme implementation | Content | 12. Components/activities | Content of programme activities, defined and described in enough detail to allow replication |
| | | 13. Complexity | Degree of complexity of the activities, such as whether single or multiple components were included |
| | | 14. Standardisation | Whether the content of components/activities followed a standardised protocol or curriculum |
| | | 15. Innovation | Degree of innovation as part of the programme |
| | | 16. Materials | Type of materials used |
| | Timing, duration, location | 17. Timing (when) | Timing and duration of the programme (start and finish) |
| | | 18. Setting (where) | Key aspects of the programme setting |
| | | 19. Dose and intensity (how much) | Number of sessions/activities and how often activities were delivered |
| | Providers/staff | 20. Provider characteristics (who) | Organization(s)/agencies involved in delivering the programme activities |
| | | 21. Provider/staff training | How programme staff were recruited, trained and supervised to deliver activities (when, how and by whom) |
| | | 22. Provider reflexivity | Reflection on the relationship between providers and participants, such as whether participants knew the staff |
| | Participants | 23. Participant recruitment | Process of recruiting programme participants |
| | | 24. Participants (to whom) | Characteristics of the participants who actually received the programme |
| | | 25. Participant preparation | Whether anything was done to prepare or brief participants prior to the start of the programme |
| | Delivery | 26. Methods used to deliver activities (how) | Specific methods/channels used for delivering programme activities |
| | | 27. Efforts to ensure fidelity of participants | Efforts to ensure fidelity, increase participation, compliance or adherence, and reduce contamination |
| | | 28. Efforts to ensure fidelity of providers/staff | Efforts to enhance adherence of providers |
| | Implementation outcomes | 29. Acceptability | Perception and comfort among stakeholders about the programme, its relative advantage and credibility |
| | | 30. Appropriateness | Perceived fit or relevance of the intervention as judged by the implementers |
| | | 31. Feasibility/practicality | The actual fit, utility or suitability of the programme for the everyday life of participants |
| | | 32. Adoption | Uptake/utilization of the programme |
| | | 33. Coverage/reach | The spread or penetration of the programme components |
| | | 34. Attrition | Non-participation and dropout of participants |
| | | 35. Unexpected end of programme | Whether the programme ended or stopped earlier than planned, along with the reasons why |
| | | 36. Reversibility | Whether it would be possible to stop the programme without negative or harmful effects |
| | | 37. Contamination of activities | Unanticipated spread of activities outside of the programme target population |
| | | 38. Fidelity | Whether the programme was delivered as intended, e.g. discrepancies between the programme design and the actual implementation of components and methods in the "real-life context" |
| | | 39. Reasons for low fidelity | Reasons for any deviation from planned activities or other parts of the programme design |
| | | 40. Sustainability | Extent to which participants may be able to use the programme in their everyday life, for example whether any support structures are in place to maintain behaviour changes |
| | | 41. Costs of implementation | Costs and resources required for implementation |
| Programme evaluation | Process evaluation | 42. Process or implementation evaluation methods | Method used to assess implementation outcomes |
| | | 43. Effect of implementation process on results | Whether the implementation process affected the quality of the programme results |
| | | 44. External events affecting implementation | Significant external events at the time of the intervention (e.g. social, political, economic and/or geographical) that might have affected implementation |
| | | 45. Ethical considerations | Ethical issues that might have affected implementation |
| | Implementation barriers and facilitators | 46. Implementation barriers and facilitators | Detailed description of factors hindering and facilitating implementation of the programme |
| | | 47. Strengths and limitations | Appraisal of programme weaknesses |
| | Impact/results evaluation | 48. Outcome evaluation methods | How programme results/impact were evaluated |
| | | 49. Unexpected/negative effects | Any unexpected and/or negative effects of the programme |
| | | 50. Differential effects | Whether programme effects differed based on characteristics such as biological sex/gender, ethnicity, socioeconomic status, age or geographic location |