Zhan Zhang, Daniel Citardi, Dakuo Wang, Yegin Genc, Juan Shan, Xiangmin Fan.
Abstract
Results of radiology imaging studies are not typically comprehensible to patients. With the advances in artificial intelligence (AI) technology in recent years, it is expected that AI technology can aid patients' understanding of radiology imaging data. The aim of this study is to understand patients' perceptions and acceptance of using AI technology to interpret their radiology reports. We conducted semi-structured interviews with 13 participants to elicit reflections pertaining to the use of AI technology in radiology report interpretation. A thematic analysis approach was employed to analyze the interview data. Participants had a generally positive attitude toward using AI-based systems to comprehend their radiology reports. AI was perceived to be particularly useful for seeking actionable information, confirming the doctor's opinions, and preparing for the consultation. However, we also found various concerns related to the use of AI in this context, such as cyber-security, accuracy, and lack of empathy. Our results highlight the necessity of providing AI explanations to promote people's trust and acceptance of AI. Designers of patient-centered AI systems should employ user-centered design approaches to address patients' concerns. Such systems should also be designed to promote trust and deliver concerning health results in an empathetic manner to optimize the user experience.
Keywords: acceptability; artificial intelligence; healthcare consumer; patient portals; radiology report
Year: 2021 PMID: 33913359 DOI: 10.1177/14604582211011215
Source DB: PubMed Journal: Health Informatics J ISSN: 1460-4582 Impact factor: 2.681