| Literature DB >> 25005128 |
Guy Tsafnat, Paul Glasziou, Miew Keen Choong, Adam Dunn, Filippo Galgani, Enrico Coiera.
Abstract
Systematic reviews, a cornerstone of evidence-based medicine, are not produced quickly enough to support clinical practice. The cost of production, availability of the requisite expertise and timeliness are often quoted as major contributors to the delay. This detailed survey of the state of the art of information systems designed to support or automate individual tasks in the systematic review, and in particular systematic reviews of randomized controlled clinical trials, reveals trends that see the convergence of several parallel research projects. We surveyed literature describing informatics systems that support or automate the processes of systematic review or each of the tasks of the systematic review. Several projects focus on automating, simplifying and/or streamlining specific tasks of the systematic review. Some tasks are already fully automated while others are still largely manual. In this review, we describe each task and the effect that its automation would have on the entire systematic review process, summarize the existing information system support for each task, and highlight where further research is needed to realize automation of the task. Integration of the systems that automate systematic review tasks may lead to a revised systematic review workflow. We envisage that the optimized workflow will lead to a system in which each systematic review is described as a computer program that automatically retrieves relevant trials, appraises them, extracts and synthesizes data, evaluates the risk of bias, performs meta-analysis calculations, and produces a report in real time.
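The closing vision of the abstract, a systematic review expressed as an executable program, can be illustrated with a minimal sketch. All of the function names below (retrieve_trials, screen, extract_data, assess_bias, meta_analyze, run_review) are hypothetical placeholders rather than tools described in the paper; they only show how the automated tasks might chain together.

```python
# A minimal sketch, assuming hypothetical placeholder functions for each task;
# none of these names correspond to tools described in the paper.
from typing import Dict, List


def retrieve_trials(question: str) -> List[Dict]:
    """Placeholder: federated search of bibliographic databases and trial registries."""
    return []


def screen(trial: Dict) -> bool:
    """Placeholder: title/abstract screening against eligibility criteria."""
    return True


def extract_data(trial: Dict) -> Dict:
    """Placeholder: extraction of PICO elements and outcome data."""
    return {}


def assess_bias(trial: Dict) -> str:
    """Placeholder: risk-of-bias judgement for one trial."""
    return "unclear"


def meta_analyze(data: List[Dict]) -> Dict:
    """Placeholder: pooled effect estimate computed from the extracted data."""
    return {"pooled_effect": None, "n_trials": len(data)}


def run_review(question: str) -> Dict:
    """Chain the tasks named in the abstract into a single 'review as a program'."""
    included = [t for t in retrieve_trials(question) if screen(t)]
    data = [extract_data(t) for t in included]
    bias = [assess_bias(t) for t in included]
    return {"question": question, "synthesis": meta_analyze(data), "risk_of_bias": bias}


if __name__ == "__main__":
    print(run_review("intervention X versus control for outcome Y"))
```

In such a workflow, re-running the program against updated source databases is what would allow the review report to be regenerated in real time.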
Year: 2014 PMID: 25005128 PMCID: PMC4100748 DOI: 10.1186/2046-4053-3-74
Source DB: PubMed Journal: Syst Rev ISSN: 2046-4053
Figure 1. Existing methods for systematic reviews follow these steps with some variations. Not all systematic reviews follow all steps. This process typically takes between 12 and 24 months. Adapted from the Cochrane [10] and CREBP [11] Manuals for systematic reviews. SR, systematic review; SD, standard deviation.
Examples of tools used for the automation of evidence synthesis tasks

| Task | Tool | Description | Limitations |
| --- | --- | --- | --- |
| Search | Quick Clinical | Federated meta-search engine | Limited source databases not optimized for systematic reviews |
| Search | Sherlock | Search engine for trial registries | Limited to clinicaltrials.gov |
| Search | Metta | Federated meta-search engine for SR | Not available publicly |
| Snowballing | ParsCit | Reference string extraction from published papers | Does not fetch nor recursively pursue citations |
| Screen titles and abstracts | Abstrackr | Machine learning-based abstract screening tool | May reduce review recall by up to 5% |
| Extract data | ExaCT | PICO and other information element extraction from abstracts | No association (e.g. of outcome with trial arm), results only available in HTML |
| Extract data | WebPlotDigitizer | Re-digitization of data from graphs and plots | No support for survival curves, no optical character recognition |
| Meta-analyze | Meta-analyst | Create a meta-analysis from extracted data | Limited integration with data-extraction and conversion programs |
| Write-up | RevMan-HAL | Automatic summary write-up from extracted data | Only works with RevMan files |
| Write-up | PRISMA Flow Diagram Generator | Automatic generation of PRISMA diagrams | Does not support some complex diagrams |
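To make the "Meta-analyze" row above concrete, the sketch below shows a standard fixed-effect inverse-variance pooling of per-trial effect estimates. It illustrates the calculation only, not the implementation used by Meta-analyst, and the study data are invented.

```python
import math

# Invented example data: (log odds ratio, variance) for each included trial.
studies = [(-0.35, 0.04), (-0.10, 0.09), (-0.55, 0.16)]

# Fixed-effect inverse-variance pooling: weight each study by 1 / variance.
weights = [1.0 / var for _, var in studies]
pooled = sum(w * est for w, (est, _) in zip(weights, studies)) / sum(weights)
se = math.sqrt(1.0 / sum(weights))

# 95% confidence interval on the log odds ratio scale, back-transformed to an OR.
lo, hi = pooled - 1.96 * se, pooled + 1.96 * se
print(f"Pooled OR = {math.exp(pooled):.2f} "
      f"(95% CI {math.exp(lo):.2f} to {math.exp(hi):.2f})")
```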
Figure 2. A screen capture of the Quick Clinical query screen from the smartphone app version. The Profile pull-down menu lets the user select the class of question being asked (e.g. medication, diagnosis, patient education). The query fields are chosen to suit the question class. The four query fields shown (Disease, Drug, Symptom and Other) are taken from the Therapy question class.
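As a simpler, single-source counterpart to the federated searching performed by tools such as Quick Clinical and Metta, the snippet below queries PubMed through NCBI's public E-utilities API (using the third-party requests library). The search term is an invented example, not one taken from the paper.

```python
import requests

# Query one bibliographic source (PubMed) via the NCBI E-utilities esearch endpoint.
BASE = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"
params = {
    "db": "pubmed",
    "term": "randomized controlled trial[pt] AND hypertension[mh]",  # example query
    "retmax": 20,
    "retmode": "json",
}

response = requests.get(BASE, params=params, timeout=30)
response.raise_for_status()

# The JSON response lists matching PubMed identifiers under esearchresult/idlist.
pmids = response.json()["esearchresult"]["idlist"]
print(f"Retrieved {len(pmids)} PMIDs:", pmids)
```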