Suzanne Shurtz, Margaret J. Foster. Instructional Service Librarian, Medical Sciences Library, Texas A&M University, 4462 TAMU, College Station, TX 77843-4462, USA. sshurtz@library.tamu.edu
Abstract
OBJECTIVE: The research sought to establish a rubric for evaluating evidence-based medicine (EBM) point-of-care tools in a health sciences library. METHODS: The authors searched the literature for EBM tool evaluations and found that most previous reviews were designed to evaluate an EBM tool's ability to answer a clinical question. The researchers' goal was to develop and complete rubrics for assessing these tools based on criteria for a general evaluation of tools (reviewing content, search options, quality control, and grading) and criteria for an evaluation of clinical summaries (searching the tools for treatments of common diagnoses and evaluating the summaries for quality control). RESULTS: Differences among the EBM tools' options, content coverage, and usability were minimal. However, the products' methods for locating and grading evidence varied widely in transparency and process. CONCLUSIONS: Because EBM tools are constantly updating and evolving, they need to be evaluated frequently. Standards for evaluating EBM tools need to be established, one method being the use of objective rubrics. In addition, EBM tools need to provide more information about authorship, reviewers, methods of evidence collection, and the grading systems employed.