Danny T Y Wu1, Annie T Chen2, John D Manning3, Gal Levy-Fix4, Uba Backonja2,5, David Borland6, Jesus J Caban7, Dawn W Dowding8, Harry Hochheiser9, Vadim Kagan10, Swaminathan Kandaswamy11, Manish Kumar12,13, Alexis Nunez, Eric Pan14, David Gotz13,15

1. Department of Biomedical Informatics, University of Cincinnati, Cincinnati, Ohio, USA.
2. Department of Biomedical Informatics and Medical Education, University of Washington School of Medicine, Seattle, Washington, USA.
3. Department of Emergency Medicine, Atrium Health's Carolinas Medical Center, Charlotte, North Carolina, USA.
4. Department of Biomedical Informatics, Columbia University, New York, New York, USA.
5. Nursing & Healthcare Leadership, University of Washington Tacoma, Tacoma, Washington, USA.
6. Renaissance Computing Institute, University of North Carolina at Chapel Hill, Chapel Hill, North Carolina, USA.
7. National Intrepid Center of Excellence, Walter Reed National Military Medical Center, Bethesda, Maryland, USA.
8. Division of Nursing, Midwifery and Social Work, School of Health Sciences, University of Manchester, Manchester, United Kingdom.
9. Department of Biomedical Informatics and Intelligent Systems Program, University of Pittsburgh, Pittsburgh, Pennsylvania, USA.
10. SentiMetrix, Inc, Bethesda, Maryland, USA.
11. Department of Mechanical and Industrial Engineering, University of Massachusetts at Amherst, Amherst, Massachusetts, USA.
12. MEASURE Evaluation, Carolina Population Center, University of North Carolina at Chapel Hill, Chapel Hill, North Carolina, USA.
13. Carolina Health Informatics Program, University of North Carolina at Chapel Hill, Chapel Hill, North Carolina, USA.
14. Healthcare Delivery Research and Evaluation, Westat, Rockville, Maryland, USA.
15. School of Information and Library Science, University of North Carolina at Chapel Hill, Chapel Hill, North Carolina, USA.
Abstract
OBJECTIVE: This article reports the results of a systematic literature review of evaluations of data visualization and visual analytics technologies in the health informatics domain. The review aims to (1) characterize the variety of evaluation methods used within the health informatics community and (2) identify best practices.

METHODS: A systematic literature review was conducted following PRISMA guidelines. PubMed searches were conducted in February 2017 using search terms representing the key concepts of interest: healthcare settings, visualization, and evaluation. References of included studies were also screened for eligibility. Data were extracted from included studies and analyzed using the PICOS framework: Participants, Interventions, Comparators, Outcomes, and Study Design.

RESULTS: After screening, 76 publications met the review criteria. Publications varied across all PICOS dimensions. The most common audience was healthcare providers (n = 43), and the most common data-gathering methods were direct observation (n = 30) and surveys (n = 27). About half of the publications focused on static, concentrated views of data with visuals (n = 36). Evaluations were heterogeneous in both the settings and the measurements used.

DISCUSSION: A variety of approaches have been used to evaluate data visualization and visual analytics technologies. Usability measures were used most often in early (prototype) implementations, whereas clinical outcomes were most common in evaluations of operationally deployed systems. These findings suggest opportunities both to (1) expand evaluation practices and (2) innovate with respect to evaluation methods for data visualization and visual analytics technologies across health settings.

CONCLUSION: Evaluation approaches are varied. New studies should adopt commonly reported metrics, context-appropriate study designs, and phased evaluation strategies.