Judith Streak Gomersall, Yuri Tertilus Jadotte, Yifan Xue, Suzi Lockwood, Dru Riddle, Alin Preda

Affiliations:
1. NHMRC Centre of Research Excellence in Aboriginal Chronic Disease Knowledge Translation and Exchange (CREATE)
2. Joanna Briggs Institute, University of Adelaide
3. Division of Science, Rutgers School of Nursing
4. Northeast Institute for Evidence Synthesis and Translation (NEST): A Collaborating Centre of the Joanna Briggs Institute
5. Texas Christian University Center for Evidence Based Practice and Research: A Collaborating Centre of the Joanna Briggs Institute
6. Romanian Centre for Evidence Based Nursing and Midwifery: An Affiliate Centre of the Joanna Briggs Institute, University of Adelaide, Adelaide, Australia
Abstract
BACKGROUND: In 2012, a working group was established to review and enhance the Joanna Briggs Institute (JBI) guidance for conducting systematic reviews of evidence from economic evaluations that address questions about the cost-effectiveness of health interventions. OBJECTIVES: To present the outcomes of the working group. METHODS: The group conducted three activities to inform the new guidance: a review of the literature on the utility/futility of systematic reviews of economic evaluations, with consideration of its implications for updating the existing methodology; an assessment of the critical appraisal tool in the existing guidance against criteria that promote validity in economic evaluation research and against two other commonly used tools; and a workshop. RESULTS: The debate in the literature on the limitations and value of systematic reviews of economic evidence cautions that such reviews are unlikely to generate one-size-fits-all answers to questions about the cost-effectiveness of interventions and their comparators. Informed by this finding, the working group reframed how review objectives are defined in the existing JBI methodology: away from determining a single cost-effectiveness measure, and toward summarizing study estimates of cost-effectiveness and, informed by the characteristics of the included studies (patient, setting, intervention components, etc.), identifying conditions conducive to lowering costs and maximizing health benefits. The existing critical appraisal tool was retained in the new guidance, with the added recommendation that model-based evaluations be assessed using a tool designed specifically for appraising model-based studies alongside the generic appraisal tool for economic evaluations.
The new guidance supports reviewers through each step of the systematic review process, following the same steps as JBI reviews of other types of evidence. DISCUSSION: The updated JBI guidance will be useful for researchers wanting to synthesize evidence on economic questions, either as stand-alone reviews or as part of comprehensive or mixed-methods evidence reviews. Although the working group's output has improved the JBI guidance for systematic reviews of economic evaluations, further work is required in several areas: adjusting the critical appraisal tool to separate questions addressing the measurement of intervention cost from those addressing the measurement of effectiveness; providing more explicit guidance for assessing the generalizability of findings; and offering a more robust method of evidence synthesis that supports the more ambitious review objectives.