David W Redick1, Jodi C Hwang2, Amy Kloosterboer3, Nicolas A Yannuzzi1, Nimesh A Patel1, Ajay E Kuriyan4,5, Jayanth Sridhar1. 1. Bascom Palmer Eye Institute, Department of Ophthalmology, University of Miami Miller School of Medicine, Miami, FL, USA. 2. Department of Ophthalmology, University of Miami Miller School of Medicine, Miami, FL, USA. 3. University Hospitals Cleveland Medical Center, Department of Ophthalmology, Case Western Reserve University, Cleveland, OH, USA. 4. Mid-Atlantic Retina, The Retina Service, Wills Eye Institute, Thomas Jefferson University, Philadelphia, PA, USA. 5. David and Ilene Flaum Eye Institute, Department of Ophthalmology, University of Rochester Medical Center, Rochester, NY, USA.
Abstract
PURPOSE: To assess the content, readability, and accountability of online patient information regarding epiretinal membranes (ERMs). METHODS: Cross-sectional study evaluating nine major medical websites on ERMs. Fifteen questions assessed patient-relevant content. Four readability indices estimated the U.S. grade reading level of the text. JAMA benchmarks (authorship, attribution, disclosure, currency) evaluated website accountability. RESULTS: The average content score was 36.78 (SD 13.91, 95% CI ±0.64) out of a possible 60, with significant variability between websites (H = 22.68, p = 0.004). Mean reading grade level was 12.29 (SD 2.30, 95% CI ±1.50). No website achieved all four JAMA benchmarks; only one fulfilled three of the four. Content score did not correlate with Google rank (order of listed websites; r = -0.23, p = 0.55) or JAMA benchmark score (r = 0.19, p = 0.62) but did correlate significantly with mean reading grade level (r = 0.67, p = 0.05). CONCLUSION: Online information regarding ERMs varies significantly, may not adequately answer common patient questions, and is written at too complex a reading level for the average patient.
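The abstract does not name the four readability indices used, but grade-level estimation of this kind is typically done with formulas such as the Flesch-Kincaid Grade Level, which combines average sentence length and average syllables per word. The sketch below is illustrative only (not the authors' actual pipeline); the naive vowel-group syllable counter is an assumption and real studies usually rely on validated tooling.

```python
import re

def count_syllables(word: str) -> int:
    """Naive syllable estimate: count groups of consecutive vowels,
    dropping a common silent trailing 'e' (e.g. 'membrane' -> 2)."""
    word = word.lower()
    n = len(re.findall(r"[aeiouy]+", word))
    if word.endswith("e") and n > 1 and not word.endswith(("le", "ee")):
        n -= 1
    return max(1, n)

def flesch_kincaid_grade(text: str) -> float:
    """Flesch-Kincaid Grade Level:
    0.39 * (words/sentences) + 11.8 * (syllables/words) - 15.59"""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z]+", text)
    syllables = sum(count_syllables(w) for w in words)
    return (0.39 * (len(words) / len(sentences))
            + 11.8 * (syllables / len(words)) - 15.59)
```

A mean grade level of 12.29, as reported here, corresponds to text requiring roughly a U.S. high-school senior's reading ability, well above the sixth-to-eighth-grade level commonly recommended for patient materials.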
Keywords:
Consumer health informatics; epiretinal membrane; internet; patient education