| Literature DB >> 29316980 |
Annette M O'Connor, Guy Tsafnat, Stephen B Gilbert, Kristina A Thayer, Mary S Wolfe.
Abstract
The second meeting of the International Collaboration for Automation of Systematic Reviews (ICASR) was held 3-4 October 2016 in Philadelphia, Pennsylvania, USA. ICASR is an interdisciplinary group whose aim is to maximize the use of technology for conducting rapid, accurate, and efficient systematic reviews of scientific evidence. Automated tools for systematic review should enable more transparent and timely reviews, maximizing the potential for identifying research findings and translating them into practical application. The meeting brought together multiple stakeholder groups, including users of summarized research, methodologists who study production processes and systematic review quality, and technologists such as software developers, statisticians, and vendors. This diversity of participants was intended to ensure effective communication with numerous stakeholders about progress toward automation of systematic reviews and to stimulate discussion about potential solutions to identified challenges. The meeting highlighted challenges, both simple and complex, and raised awareness among participants about ongoing efforts by various stakeholders. One outcome of this forum was the identification of several short-term projects that participants felt would advance the automation of tasks in the systematic review workflow, including (1) fostering better understanding of available tools, (2) developing validated datasets for testing new tools, (3) determining a standard method to facilitate interoperability of tools, such as through an application programming interface (API), and (4) establishing criteria to evaluate the quality of tools' output. ICASR 2016 provided a beneficial forum for focused discussion about tool development and resources and reconfirmed ICASR members' commitment to the automation of systematic reviews.
Keywords: Automation; Data abstraction; Data extraction; Evidence synthesis; Priority ranking; Systematic review; Tools
Year: 2018 PMID: 29316980 PMCID: PMC5759184 DOI: 10.1186/s13643-017-0667-4
Source DB: PubMed Journal: Syst Rev ISSN: 2046-4053
Guiding principles proposed at the 1st ICASR meeting in Vienna (http://ebrnetwork.org/the-vienna-principles/)
| • Systematic reviews involve multiple tasks, each with different issues, but all must be improved. |
| • Automation may assist with all tasks, from scoping reviews and identifying research gaps, to protocol development, to writing and dissemination of the review. |
| • The processes for each task can and should be continuously improved to be more efficient and more accurate. |
| • Automation can and should facilitate the production of systematic reviews that adhere to high standards for the reporting, conduct, and updating of rigorous reviews. |
| • Developments should also provide for flexibility in combination uses, e.g., subdividing or merging steps and allowances for different users to use different interfaces. |
| • Different groups with different expertise are working on different parts of the problem; to improve reviews as a whole will require collaboration between these groups. |
| • Every automation technique should be shared, preferably by making code, evaluation data, and corpora available for free. |
| • All automation techniques and tools should be evaluated using a recommended and replicable method with results and data reported. |
Challenges to automation identified by meeting participants and invited speakers
| Broader challenges |
| • Social acceptance of automation technology |
| • Development of flexible systems for different disciplines |
| • Acquiring resources for development |
| • Fostering collaboration in a competitive environment |
| • Keeping up with rapidly evolving technologies and approaches, such as open data |
| • Making automation approaches compatible with stakeholders' transparency needs, given the “black box” nature of many technologies such as machine learning |
| Technological challenges |
| • Designing an application programming interface that meets the needs of multiple scientific domains and goals for different systematic reviews |
| • Integrating an application programming interface into both new and existing software tools |
| • Creating cross-compatibility of tools |
| • Addressing issues of intellectual property |
| • Meeting review-specific/data-specific challenges |
| • Extracting data from full texts |
| • Developing approaches for algorithm and tool validation |
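The interoperability and cross-compatibility challenges above hinge on tools agreeing on a shared interface for exchanging citations and screening results. The meeting did not specify such an API, so the following is only a minimal sketch of the idea: all names (`Citation`, `ScreeningTool`, `rank`) are hypothetical, and the keyword-overlap ranker stands in for whatever model a real tool would use. Any tool implementing the same interface becomes interchangeable in a review pipeline.

```python
from dataclasses import dataclass
from typing import Protocol

# All names below are illustrative; ICASR did not define this API.

@dataclass
class Citation:
    """A bibliographic record passed between tools in a shared format."""
    id: str
    title: str
    abstract: str

class ScreeningTool(Protocol):
    """Hypothetical common interface: rank citations by relevance to a query.

    Any tool exposing this method could be swapped into the same pipeline,
    which is the kind of cross-compatibility the meeting discussed.
    """
    def rank(self, citations: list[Citation], query: str) -> list[tuple[str, float]]:
        ...

class KeywordRanker:
    """Toy implementation: score = fraction of query terms found in the text."""
    def rank(self, citations: list[Citation], query: str) -> list[tuple[str, float]]:
        terms = query.lower().split()
        scored = []
        for c in citations:
            text = (c.title + " " + c.abstract).lower()
            score = sum(t in text for t in terms) / len(terms)
            scored.append((c.id, score))
        # Highest-scoring citations first, for priority-ranked screening.
        return sorted(scored, key=lambda r: r[1], reverse=True)

if __name__ == "__main__":
    pool = [
        Citation("pmid:1", "Automation of systematic reviews",
                 "Machine learning for citation screening."),
        Citation("pmid:2", "Crop yields under drought",
                 "An agronomy field study."),
    ]
    tool: ScreeningTool = KeywordRanker()
    print(tool.rank(pool, "systematic review automation"))
```

Under this sketch, evaluating a new tool (challenge: "developing approaches for algorithm and tool validation") reduces to running it against a validated dataset through the same `rank` interface and comparing the rankings.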