Akira Sakai 1,2,3,4, Masaaki Komatsu 5, Reina Komatsu 2,6, Ryu Matsuoka 2,6, Suguru Yasutomi 1,2, Ai Dozen 4, Kanto Shozu 4, Tatsuya Arakaki 6, Hidenori Machino 4,5, Ken Asada 4,5, Syuzo Kaneko 4,5, Akihiko Sekizawa 6, Ryuji Hamamoto 3,4.
Abstract
Diagnostic support tools based on artificial intelligence (AI) have exhibited high performance in various medical fields. However, their clinical application remains challenging because of the lack of explanatory power in AI decisions (the black box problem), which makes it difficult to build trust with medical professionals. Visualizing the internal representations of deep neural networks can increase explanatory power and improve the confidence of medical professionals in AI decisions. We propose a novel deep learning-based explainable representation, the "graph chart diagram," to support fetal cardiac ultrasound screening, a task with low detection rates of congenital heart diseases due to the difficulty of mastering the technique. With this representation, screening performance, measured as the arithmetic mean of the area under the receiver operating characteristic curve (AUC), improves from 0.966 to 0.975 for experts, 0.829 to 0.890 for fellows, and 0.616 to 0.748 for residents. This is the first demonstration in which examiners used a deep learning-based explainable representation to improve the performance of fetal cardiac ultrasound screening, highlighting the potential of explainable AI to augment examiner capabilities.
Keywords: abnormality detection; congenital heart disease; deep learning; explainable artificial intelligence; fetal cardiac ultrasound screening
Year: 2022 PMID: 35327353 PMCID: PMC8945208 DOI: 10.3390/biomedicines10030551
Source DB: PubMed Journal: Biomedicines ISSN: 2227-9059