Michelle M Haby, Evelina Chapman, Rachel Clark, Jorge Barreto, Ludovic Reveiz, John N Lavis.
Abstract
BACKGROUND: The objective of this work was to inform the design of a rapid response program to support evidence-informed decision-making in health policy and practice for the Americas region. Specifically, we focus on the following: (1) What are the best methodological approaches for rapid reviews of the research evidence? (2) What other strategies are needed to facilitate evidence-informed decision-making in health policy and practice? and (3) How best to operationalize a rapid response program?
Year: 2016 PMID: 27538384 PMCID: PMC4990866 DOI: 10.1186/s13012-016-0472-9
Source DB: PubMed Journal: Implement Sci ISSN: 1748-5908 Impact factor: 7.327
Areas where “shortcuts” could be considered to reduce time to completion of rapid reviews
| Systematic review step | Possible “shortcuts” | Potential impact on the validity of the results | Relevant AMSTAR question and potential impact of shortcut on AMSTAR score |
|---|---|---|---|
| Preparation of a protocol | • Omit protocol | Unknown | Q1. Loss of one point if a protocol is not prepared and/or not mentioned in report |
| Question formulation | • Limit the number of questions and sub-questions | None expected | |
| Selecting relevant studies | • One reviewer screens titles and abstracts | Unknown, though one reviewer could miss up to 9% of eligible randomized controlled trials | Q2. Loss of one point if only one reviewer does screening and/or only one reviewer does data extraction |
| Data extraction | • One reviewer extracts data | Can increase the number of errors, but the impact on results is not known | |
| | • One reviewer extracts data with checking by a second reviewer | Unknown | |
| | • Data extraction limited to key characteristics, results, conflicts of interest | Unknown | |
| Literature search | • Limit number of databases searched | Limiting the number of databases searched can increase efficiency without compromising validity | Q3. Loss of one point if fewer than two databases searched and/or no supplementary strategies |
| Inclusion criteria | | | |
| Gray literature | • Limit or omit gray literature | Could introduce publication bias, but the evidence is mixed | Q4. Loss of one point if gray literature omitted |
| Language | • English only | Effect can vary depending on the question | |
| Dates | • Narrow time frame, e.g., last 5 or 10 years | None expected | |
| Study types | • Restrict study types to systematic reviews (and economic evaluations) | None expected | |
| Quality assessment | • Limit or omit quality assessment | Not recommended. Several authors suggest that, where resources are limited, priority should be given to quality assessment rather than extensive searching | Q7 and Q8. Loss of two points if not assessed, documented, and used in formulation of conclusions |
| | • Omit “a priori” specification | Unknown | |
| Data synthesis | • Narrative synthesis only (no meta-analysis) | Unknown; meta-analysis can increase power and precision but can also mislead if not applied appropriately and done correctly | Q9. None if it is explained that meta-analysis was not possible due to heterogeneity; otherwise, loss of one point |
| Assessment of publication bias | • Omit | Unknown | Q10. Loss of one point if omitted |
| Assessment of conflict of interest | • Omitted for individual studies and/or for the systematic review | Unknown | Q11. Loss of one point if omitted |
| Report | • Information included limited | Unknown, but can impact the AMSTAR score if insufficient detail of methods is provided to enable a quality assessment; sufficient detail of methods will help the reviewer assess the validity of the results | Q1–11. Potentially large loss of points if key AMSTAR questions not covered |
| External peer review | • Omit or limit | Unknown | |
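The AMSTAR deductions in the last column can be tallied to estimate the maximum point loss a given combination of shortcuts risks. The sketch below is illustrative only and not part of the paper; the shortcut names and the dictionary structure are assumptions, with the per-shortcut deductions taken from the table above:

```python
# Illustrative sketch (not from the paper): maximum potential AMSTAR point
# deductions per shortcut, as listed in the table's last column.
# Shortcut names are invented labels for the table rows.
AMSTAR_DEDUCTIONS = {
    "omit_protocol": 1,                    # Q1
    "single_reviewer_screening": 1,        # Q2 (screening and/or extraction)
    "fewer_than_two_databases": 1,         # Q3
    "omit_gray_literature": 1,             # Q4
    "omit_quality_assessment": 2,          # Q7 and Q8
    "narrative_synthesis_unexplained": 1,  # Q9, only if heterogeneity not explained
    "omit_publication_bias": 1,            # Q10
    "omit_conflict_of_interest": 1,        # Q11
}

def amstar_point_loss(shortcuts):
    """Total potential AMSTAR deduction for a chosen set of shortcuts."""
    return sum(AMSTAR_DEDUCTIONS[s] for s in shortcuts)

# Example: omitting the protocol, gray literature, and quality assessment
# could cost up to 4 of AMSTAR's 11 points.
loss = amstar_point_loss(
    ["omit_protocol", "omit_gray_literature", "omit_quality_assessment"]
)
print(loss)  # 4
```

This treats each deduction as independent and maximal; in practice, as the table notes, several losses are conditional (e.g., Q9 applies only when the absence of meta-analysis is not explained).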
Rapid response programs selected for case studies
| Rapid response program | Reason for selection | References |
|---|---|---|
| Cochrane response by Cochrane Innovations, Cochrane Collaboration^a | Has a potential global reach and is supported by the expertise and experience of the Cochrane Collaboration, though is still in development | |
| McMaster Health Forum Rapid Response Program, McMaster University, Canada | Comes from a developed country with a strong history in knowledge translation; potentially national reach | |
| Regional East African Community Health (REACH) Policy Initiative, Uganda | Comes from a low-income economy and has a published evaluation | |
| Sax Institute Evidence Check Program, NSW, Australia | Has a long history (since 2007) and is a state-level program | |
^a Cochrane Innovations is a trading company wholly owned by The Cochrane Collaboration
Features of the four rapid response models developed as case studies
| Model | Started | Reach | How funded | Time to complete a RR | External review of RR | RRs made publicly available | Lag period before publication |
|---|---|---|---|---|---|---|---|
| Cochrane response | 2013 | Potentially global | | ≈8 weeks (first review took 12 weeks) | Yes | All | Yes |
| McMaster Health Forum Rapid Response Program | 2012 | Potentially national (Canada) | | Max 6 weeks | Yes | All | Yes |
| REACH Policy Initiative | 2010 | National (Uganda) | | Max 4 weeks | Yes | Not reported | Not reported |
| Sax Institute Evidence Check Program | 2006 | State, potentially national (Australia) | | ≈12–16 weeks | No | Most (some kept confidential if requested by funder) | Yes |
RR = rapid review
Operational issues highlighted by the case studies
| Issue | Description of issue |
|---|---|
| Contracts and intellectual property | If it is a “user-pays” model, the use of a contract can slow the process down. However, the impact can be minimized by operating in good faith and starting the review before the contract is signed (e.g., Cochrane Innovations, Sax Institute). |
| External review of the rapid review | External review or “merit review” has the potential to slow the process down if reviewers don’t respond quickly but the different services all seem to have found ways to manage this well, e.g., approaching another reviewer if the first one can’t commit to a quick response. |
| Staffing | Recruiting staff with the right mix of skills and qualifications was noted as an issue for the REACH Policy Initiative model. The other three models used mentoring or internal training to address this issue, with the Sax Institute key informant noting that the Institute also had plans to develop a formal training program for researchers. |
| Evaluation | None of the models has formally evaluated the impact of the service on the uptake of research evidence for policy and/or practice, though there are plans to do this for the McMaster service. |
| Issues particular to developing countries | Having a fast and reliable internet connection was noted as an issue for the REACH Policy Initiative model. |