| Literature DB >> 30340488 |
Julie M Robillard, Jessica H Jun, Jen-Ai Lai, Tanya L Feng.
Abstract
BACKGROUND: Online health information is unregulated and can be of highly variable quality. There is currently no singular quantitative tool that has undergone a validation process, can be used for a broad range of health information, and strikes a balance between ease of use, concision and comprehensiveness. To address this gap, we developed the QUality Evaluation Scoring Tool (QUEST). Here we report on the analysis of the reliability and validity of the QUEST in assessing the quality of online health information.
Keywords: Instrument validation; Online health information; Quality evaluation; eHealth
Year: 2018 PMID: 30340488 PMCID: PMC6194721 DOI: 10.1186/s12911-018-0668-9
Source DB: PubMed Journal: BMC Med Inform Decis Mak ISSN: 1472-6947 Impact factor: 2.796
Fig. 1 Review of existing quality evaluation tools (n = 16). Adapted from the CONSORT 2010 Flow Diagram available at http://www.consort-statement.org/consort-statement/flow-diagram
Fig. 2 Description of the QUEST criteria. Scores in the individual sections are weighted and summed to generate a total score of up to 28
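The weighted-sum scoring described in Fig. 2 can be sketched as follows. Note that the item weights and per-item score ranges below are hypothetical placeholders: the record states only that six items, each scored 0–2 or 0–1, are differentially weighted to a 28-point maximum, so these particular numbers are chosen solely to reproduce that maximum and do not reflect the published QUEST weights.

```python
# Hedged sketch of QUEST-style weighted scoring.
# NOTE: the (max raw score, weight) pairs below are HYPOTHETICAL -- chosen
# only so the maximum possible total is 28, as the paper states. The actual
# weights are defined in the published QUEST instrument.
HYPOTHETICAL_ITEMS = {
    #  criterion             (max raw score, weight)
    "authorship":            (2, 2),
    "attribution":           (2, 3),
    "conflict_of_interest":  (1, 3),
    "currency":              (2, 3),
    "complementarity":       (1, 3),
    "tone":                  (2, 3),
}

def quest_score(raw_scores):
    """Weighted sum of raw item scores; rejects out-of-range ratings."""
    total = 0
    for item, (max_raw, weight) in HYPOTHETICAL_ITEMS.items():
        s = raw_scores[item]
        if not 0 <= s <= max_raw:
            raise ValueError(f"{item} score {s} outside 0-{max_raw}")
        total += s * weight
    return total

# Maximum possible total under these placeholder weights:
max_total = quest_score({k: v[0] for k, v in HYPOTHETICAL_ITEMS.items()})  # 28
```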
Comparison of quality items used in the QUEST, HONcode, Sandvik, and DISCERN tools
| Quality criteria | QUEST | HONcode | Sandvik | DISCERN |
|---|---|---|---|---|
| Attribution | X | X | X | X |
| Currency | X | X | X | X |
| Authorship | X | X | X | |
| Balance | | X | X | X |
| Reliability | | X | | X |
| Interactivity | | X | X | |
| Tone | X | | | |
| Conflict of interest | X | | | |
| Complementarity | X | | | |
| Mission/target | | X | | |
| Audience | | X | | |
| Privacy | | X | | |
| Overall Reliability | | X | | |
| Ownership | | | X | |
| Navigability | | | X | |
| Quality of information on treatment choices | | | | X |
| Overall Rating | | | | X |
Weighted Cohen’s kappa, standard error and 95% CI for treatment articles (n = 16)
| | Authorship | Attribution | Type of study | Conflict of interest | Currency | Complementarity | Tone |
|---|---|---|---|---|---|---|---|
| Observed kappa | 0.91 | 0.79 | 1 | 1 | 0.86 | 0.86 | 0.91 |
| SE | 0.08 | 0.10 | 0 | 0.24 | 0.13 | 0.13 | 0.08 |
| 95% CI | 0.75, 1 | 0.58, 0.99 | 1, 1 | 0.32, 1 | 0.60, 1 | 0.60, 1 | 0.75, 1 |
Weighted Cohen’s kappa, standard error and 95% CI for prevention articles (n = 29)
| | Authorship | Attribution | Type of study | Conflict of interest | Currency | Complementarity | Tone |
|---|---|---|---|---|---|---|---|
| Observed kappa | 0.88 | 0.89 | 0.89 | 0.74 | 1 | 0.75 | 0.95 |
| SE | 0.09 | 0.06 | 0.06 | 0.16 | 0 | 0.14 | 0.04 |
| 95% CI | 0.71, 1 | 0.78, 1 | 0.77, 1 | 0.43, 1 | 1, 1 | 0.49, 1 | 0.86, 1 |
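The inter-rater agreement statistics in the two tables above are weighted Cohen's kappas. A minimal pure-Python sketch of a linearly weighted kappa is below; this is a generic implementation for illustration, not the authors' analysis code, and the rating vectors in the usage example are invented.

```python
from collections import Counter

def weighted_kappa(rater1, rater2, categories):
    """Linearly weighted Cohen's kappa for two raters' ordinal scores.

    `categories` must list the ordered score levels, e.g. [0, 1, 2].
    Returns 1 for perfect agreement, 0 for chance-level agreement.
    """
    k = len(categories)
    idx = {c: i for i, c in enumerate(categories)}
    n = len(rater1)
    observed = Counter(zip(rater1, rater2))          # joint rating counts
    marg1, marg2 = Counter(rater1), Counter(rater2)  # per-rater marginals
    disagree_obs = disagree_exp = 0.0
    for a in categories:
        for b in categories:
            w = abs(idx[a] - idx[b]) / (k - 1)  # linear disagreement weight
            disagree_obs += w * observed.get((a, b), 0) / n
            disagree_exp += w * (marg1[a] / n) * (marg2[b] / n)
    return 1.0 - disagree_obs / disagree_exp

# Invented example: two raters scoring six articles on a 0-2 item.
r1 = [2, 1, 0, 2, 1, 2]
r2 = [2, 1, 0, 1, 1, 2]
kappa = weighted_kappa(r1, r2, categories=[0, 1, 2])
```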
Kendall’s tau, standard error, 95% CI, and P-value of each test for treatment articles (n = 16)
| Comparison | Kendall’s tau (95% CI) | SE | P-value |
|---|---|---|---|
| QUEST vs HONcode | 0.47 (0.09–0.85) | 0.19 | 0.015 |
| QUEST vs Sandvik | 0.62 (0.23–1.01) | 0.20 | 0.002 |
| QUEST vs DISCERN | 0.65 (0.28–1.02) | 0.19 | < 0.001 |
| HONcode vs Sandvik | 0.53 (0.13–0.92) | 0.20 | 0.009 |
| HONcode vs DISCERN | 0.58 (0.20–0.96) | 0.19 | 0.003 |
| Sandvik vs DISCERN | 0.58 (0.19–0.96) | 0.20 | 0.004 |
Kendall’s tau, standard error, 95% CI, and P-value of each test for prevention articles (n = 29)
| Comparison | Kendall’s tau (95% CI) | SE | P-value |
|---|---|---|---|
| QUEST vs HONcode | 0.64 (0.37–0.99) | 0.14 | < 0.001 |
| QUEST vs Sandvik | 0.62 (0.34–0.90) | 0.14 | < 0.001 |
| QUEST vs DISCERN | 0.55 (0.29–0.82) | 0.14 | < 0.001 |
| HONcode vs Sandvik | 0.61 (0.33–0.89) | 0.14 | < 0.001 |
| HONcode vs DISCERN | 0.57 (0.31–0.84) | 0.14 | < 0.001 |
| Sandvik vs DISCERN | 0.41 (0.13–0.68) | 0.14 | 0.004 |
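The tool-to-tool correlations above are Kendall's tau rank correlations. A basic sketch is below; it implements tau-a (no tie correction), whereas the paper's statistics may use a tie-corrected variant, and the sample scores are made up.

```python
def kendall_tau_a(x, y):
    """Kendall's tau-a: (concordant - discordant) / total pairs.

    No correction for ties, unlike the tau-b that many stats
    packages report by default.
    """
    n = len(x)
    concordant = discordant = 0
    for i in range(n):
        for j in range(i + 1, n):
            s = (x[i] - x[j]) * (y[i] - y[j])
            if s > 0:
                concordant += 1
            elif s < 0:
                discordant += 1
    return (concordant - discordant) / (n * (n - 1) / 2)

# Illustrative (made-up) quality scores from two tools on five articles;
# the rankings are fully concordant, so tau is 1.0.
tool_a = [10, 14, 7, 21, 18]
tool_b = [3, 5, 2, 7, 6]
tau = kendall_tau_a(tool_a, tool_b)  # -> 1.0
```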
Characteristics of articles (n = 36) retrieved between January 15, 2016 and February 5, 2016 from Google Scholar and PubMed using the search terms: online, health information, evaluate, evaluation, tool, quality, validity, testing, validation, and assessment. Inclusion criteria: 1) the article is in English; 2) the article focuses on the validation of an assessment tool related to the quality of health information
| Focus of article | Number of articles | Article title | Author(s) | Date of Publication |
|---|---|---|---|---|
| Observational or descriptive paper | 5 | Assessing, controlling, and assuring the quality of medical information on the internet: Caveant lector et viewor—let the reader and viewer beware | Silberg WM, Lundberg GD, and Musacchio RA | 1997 |
| | | The Health On the Net Code of Conduct for Medical and Health Websites | Boyer, C., M. Selby, J. R. Scherrer, and R. D. Appel | 1998 |
| | | Emerging Challenges in Using Health Information from the Internet | Theodosiou, Louise, and Jonathan Green | 2003 |
| | | Health information and the internet: The 5 Cs website evaluation tool | Roberts, Lorraine | 2010 |
| | | Quality of patient health information on the Internet: reviewing a complex and evolving landscape | Fahy, Eamonn, Rohan Hardikar, Adrian Fox, and Sean Mackay | 2014 |
| Evaluation of quality of information using tool(s) | 7 | Health information and interaction on the internet: a survey of female urinary incontinence | Sandvik, Hogne | 1999 |
| | | Evaluation of Websites that Provide Information on Alzheimer’s Disease | Bouchier, H., and P. A. Bath | 2003 |
| | | Accuracy of internet recommendations for prehospital care of venomous snake bites | Barker et al. | 2010 |
| | | The quality of online antidepressant drug information: An evaluation of English and Finnish language Web sites | Prusti, Marjo, Susanna Lehtineva, Marika Pohjanoksa-Mäntylä, and J. Simon Bell | 2012 |
| | | Evaluation of dengue-related health information on the Internet | Rao et al. | 2012 |
| | | A Methodology to Analyze the Quality of Health Information on the Internet: The Example of Diabetic Neuropathy | Chumber, Sundeep, Jörg Huber, and Pietro Ghezzi | 2014 |
| | | Evaluation of Online Health Information on Clubfoot Using the DISCERN Tool | Kumar, Venkatesan S., Suresh Subramani, Senthil Veerapan, and Shah A. Khan | 2014 |
| Development of tool | 12 | DISCERN: an instrument for judging the quality of written consumer health information on treatment choices | Charnock, D., S. Shepperd, G. Needham, and R. Gann | 1999 |
| | | Development of a self-assessment method for patients to evaluate health information on the Internet | Jones J. | 1999 |
| | | Development and Application of a Tool Designed to Evaluate Web Sites Providing Information on Alzheimer’s Disease | Bath, P. A., and H. Bouchier | 2003 |
| | | Development and validation of an international appraisal instrument for assessing the quality of clinical practice guidelines: the AGREE project | Cluzeau et al. | 2003 |
| | | Design and testing of a tool for evaluating the quality of diabetes consumer-information web sites | Seidman, Joshua J, Donald Steinwachs, and Haya R Rubin | 2003 |
| | | The development of QUADAS: a tool for the quality assessment of studies of diagnostic accuracy included in systematic reviews | Whiting, Penny, Anne WS Rutjes, Johannes B. Reitsma, Patrick MM Bossuyt, and Jos Kleijnen | 2003 |
| | | Ensuring Quality Information for Patients: Development and Preliminary Validation of a New Instrument to Improve the Quality of Written Health Care Information | Moult, Beki, Linda S Franck, and Helen Brady | 2004 |
| | | A model for online consumer health information quality | Stvilia, Besiki, Lorri Mon, and Yong Jeong Yi | 2009 |
| | | Health Literacy INDEX: Development, Reliability, and Validity of a New Tool for Evaluating the Health Literacy Demands of Health Information Materials | Kaphingst et al. | 2012 |
| | | Measuring the quality of Patients’ goals and action plans: development and validation of a novel tool | Teal, Cayla R., Paul Haidet, Ajay S. Balasubramanyam, Elisa Rodriguez, and Aanand D. Naik | 2012 |
| | | The Communication AssessmenT Checklist in Health (CATCH): a tool for assessing the quality of printed educational materials for clinicians | Genova, Juliana, Isaac Nahon-Serfaty, Selma Chipenda Dansokho, Marie-Pierre Gagnon, Jean-Sébastien Renaud, and Anik M. C. Giguère | 2014 |
| | | Development and Validation of the Guide for Effective Nutrition Interventions and Education (GENIE): A Tool for Assessing the Quality of Proposed Nutrition Education Programs | Hand, Rosa K., Jenica K. Abram, Katie Brown, Paula J. Ziegler, J. Scott Parrott, and Alison L. Steiber | 2015 |
| Evaluation of tool(s) | 9 | Published Criteria for Evaluating Health Related Web Sites: Review | Kim, Paul, Thomas R. Eng, Mary Jo Deering, and Andrew Maxfield | 1999 |
| | | Examination of instruments used to rate quality of health information on the internet: chronicle of a voyage with an unclear destination | Gagliardi, Anna, and Alejandro R. Jadad | 2002 |
| | | Evaluating the reliability and validity of three tools to assess the quality of health information on the Internet | Ademiluyi, Gbogboade, Charlotte E Rees, and Charlotte E Sheard | 2003 |
| | | The Evaluation Criteria of Internet Health Information | Kang, Nam-Mi, Sukhwa Kim, Seungkuen Hong, Seewon Ryu, Hye-Jung Chang, and Jeongeun Kim | 2006 |
| | | Assessing the Quality of Websites Providing Information on Multiple Sclerosis: Evaluating Tools and Comparing Sites | Harland, Juliet, and Peter Bath | 2007 |
| | | What Do Evaluation Instruments Tell Us About the Quality of Complementary Medicine Information on the Internet? | Breckons, Matthew, Ray Jones, Jenny Morris, and Janet Richardson | 2008 |
| | | Tools Used to Evaluate Written Medicine and Health Information Document and User Perspectives | Luk, Alice, and Parisa Aslani | 2011 |
| | | Tools for Assessing the Quality and Accessibility of Online Health Information: Initial Testing among Breast Cancer Websites | Whitten, Pamela, Samantha Nazione, and Carolyn Lauckner | 2013 |
| | | Web-site evaluation tools: a case study in reproductive health information | Aslani, Azam, Omid Pournik, Ameen Abu-Hanna, and Saeid Eslami | 2014 |
| Systematic literature review of tools | 3 | Empirical Studies Assessing the Quality of Health Information for Consumers on the World Wide Web: A Systematic Review | Eysenbach et al. | 2002 |
| | | Online Health Information Tool Effectiveness for Older Patients: A Systematic Review of the Literature | Bolle, Sifra, Julia C. M. van Weert, Joost G. Daams, Eugène F. Loos, Hanneke C. J. M. de Haes, and Ellen M. A. Smets | 2015 |
| | | Quality of Health Information for Consumers on the Web: A Systematic Review of Indicators, Criteria, Tools, and Evaluation Results | Zhang, Yan, Yalin Sun, and Bo Xie | 2015 |
Comparison of evaluation tools previously described in the literature and QUEST
| Name of tool | Focus | Criteria | Format |
|---|---|---|---|
| QUality Evaluation Scoring Tool (QUEST) | Quality of online health information | Authorship, attribution, conflict of interest, complementarity, currency, tone | 6 questions rated on a scale of 0–2 or 0–1 and differentially weighted, yielding an overall quality score between 0 and 28 |
| DISCERN | Quality of written information about treatment choices | Reliability, balance, dates, source, quality of information on treatment choices, overall rating | 15 questions rated on a scale of 1–5 |
| EQIP: Ensuring Quality Information for Patients | Quality of written patient information applicable to all information types | Clarity, patient-oriented design, currency, attribution, conflict of interest, completeness | 20 questions rated Yes/Partly/No with an equation to generate a % score |
| Jones’ Self-Assessment Method | Self-assessment tool for patients to evaluate quality and relevance of health care oriented websites | Content, design, communication, and credibility | 9 broad questions based on 4 criteria rated Yes/No/NA |
| Health on the Net Foundation’s HONcode Patient Evaluation Tool | Patient evaluation tool for health-related websites | Authorship, attribution, currency, reliability, balance, mission/target audience, privacy, interactivity, overall reliability | 16-item interactive questionnaire returning a % score |
| Silberg standards | Standards of quality for online medical information for consumers and professionals | Authorship, attribution, disclosure, currency | Set of core standards; no score is generated |
| Sandvik’s General Quality Criteria | General quality measure for online health information | Ownership, authorship, source, currency, interactivity, navigability, balance | 7 questions rated on a scale of 0–2 |
| Health Information Technology Institute (HITI) Information Quality Tool * | Quality measure for health-related websites | Credibility, content, disclosure, links, design, interactivity | |
| 5 C’s website evaluation tool | Structured guide to systematically evaluating websites; specifically developed for nurses to use in patient care and education | Credibility, currency, content, construction, clarity | Series of 36 open-ended and yes/no questions grouped under the “5 C’s”; no score is generated |
| Health Literacy INDEX | Tool to evaluate the health literacy demands of health information materials | Plain language, clear purpose, supporting graphics, user involvement, skill-based learning, audience appropriateness, instructions, development details, evaluation methods, strength of evidence | 63 indicators/criteria rated yes/no, yielding criterion-specific scores and an overall % score |
| Bath and Bouchier’s evaluation tool | Tool to evaluate websites providing information on Alzheimer’s disease | General details, information for carers, currency, ease of use, general conclusions | 47 questions scored from 0 to 2, generating an overall % score |
| Seidman quality evaluation tool | Quality of diabetes consumer-information websites | Explanation of methods, validity of methods, currency, comprehensiveness, accuracy | 7 structural measures and 34 performance measures, generating composite scores by section and an overall score |
| Appraisal of Guidelines, REsearch and Evaluation (AGREE) Collaboration instrument | Quality of clinical practice guidelines | Scope and purpose, stakeholder involvement, rigour of development, clarity and presentation, applicability, editorial independence | 23 items grouped into six quality domains with a 4-point Likert scale to score each item |
| Communication AssessmenT Checklist in Health (CATCH) tool | Quality of printed educational materials for clinicians | Appearance, layout and typography, clarity of content, language and readability, graphics, risk communication, scientific value, emotional appeal, relevance, social value/source credibility, social value/usefulness for the clinician, social value/usefulness for the health care system (hospital or government) | 55 items nested in 12 concepts, each rated yes/no, generating concept-specific and overall scores |
| LIDA Minervation tool | Evaluates the design and content of healthcare websites | Accessibility, usability (clarity, consistency, functionality, engagability), reliability (currency, conflict of interest, content production) | 41 questions scored on a scale of 0–3, yielding a total % score |
| Mitretek Information Quality Tool (IQT) * | Evaluates information quality of online health information | Authorship, sponsorship, currency, accuracy, confidentiality, navigability | 21 questions rated yes/no and weighted according to importance, generating a total score between 0 and 4 |
| “Date, Author, References, Type, and Sponsor” (DARTS) | Assists patients in appraising the quality of online medicines information | Currency, authorship, credibility, purpose, conflict of interest | A series of six guiding questions; no score generated |
| Quality Index for health-related Media Reports (QIMR) | Monitors the quality of health research reports in the lay media | Background, sources, results, context, validity | 17 items rated on a 0–6 Likert scale with an 18th global rating |
| Index of Scientific Quality (ISQ) | Index of scientific quality for health reports in the lay press | Applicability, opinions vs. facts, validity, magnitude, precision, consistency, consequences | 7 items rated on a 1–5 Likert scale with an 8th global rating |