
Visually interpretable deep network for diagnosis of breast masses on mammograms.

Seong Tae Kim, Jae-Hyeok Lee, Hakmin Lee, Yong Man Ro.

Abstract

Recently, deep learning technology has achieved various successes in medical image analysis, including computer-aided diagnosis (CADx). However, current deep-learning-based CADx approaches are limited in their ability to interpret diagnostic decisions, and this limited interpretability is a major obstacle to their practical use. In this paper, a novel visually interpretable deep network framework is proposed that provides diagnostic decisions together with visual interpretation. The proposed method is motivated by the fact that radiologists characterize breast masses according to the breast imaging reporting and data system (BIRADS). The framework consists of a BIRADS guided diagnosis network and a BIRADS critic network. A 2D map, named the BIRADS guide map, is generated during the inference process of the deep network. The visual features extracted from the breast masses are refined by the BIRADS guide map, which helps the deep network focus on more informative areas. The BIRADS critic network makes the BIRADS guide map relevant to the characterization of masses in terms of the BIRADS description. To verify the proposed method, comparative experiments were conducted on a public mammogram database. On the independent test set (170 malignant masses and 170 benign masses), the proposed method achieved significantly higher performance than the same deep network without the BIRADS guide map (p < 0.05). Moreover, visualization was conducted to show where the deep network exploited more information. This study demonstrates that the proposed framework is a promising approach for visually interpreting the diagnostic decisions of a deep network.
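The paper does not include code, but the core idea of refining visual features with a 2D guide map can be sketched as a spatial-attention-style reweighting. The sketch below is a hypothetical illustration, not the authors' implementation: `refine_features` and the sigmoid squashing are assumptions about how a guide map might modulate convolutional feature channels.

```python
import numpy as np

def refine_features(features: np.ndarray, guide_map: np.ndarray) -> np.ndarray:
    """Refine feature maps with a 2D guide map (illustrative sketch only).

    features:  (C, H, W) convolutional feature maps
    guide_map: (H, W) guide map, e.g. a BIRADS-style 2D map

    The guide map is squashed to (0, 1) with a sigmoid so it acts as soft
    spatial weights, then broadcast over every feature channel.
    """
    weights = 1.0 / (1.0 + np.exp(-guide_map))      # sigmoid -> (0, 1)
    return features * weights[np.newaxis, :, :]     # broadcast over channels

# Toy example: 4 feature channels over an 8x8 spatial grid
rng = np.random.default_rng(0)
features = rng.standard_normal((4, 8, 8))
guide_map = rng.standard_normal((8, 8))
refined = refine_features(features, guide_map)
print(refined.shape)  # (4, 8, 8)
```

Under this reading, spatial locations the guide map scores highly pass their features through nearly unchanged, while low-scoring locations are attenuated, which is consistent with the paper's claim that the guide map helps the network focus on more informative areas.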


Year:  2018        PMID: 30511660     DOI: 10.1088/1361-6560/aaef0a

Source DB:  PubMed          Journal:  Phys Med Biol        ISSN: 0031-9155            Impact factor:   3.609


Related references:  7 in total

Review 1.  CAD and AI for breast cancer-recent development and challenges.

Authors:  Heang-Ping Chan; Ravi K Samala; Lubomir M Hadjiiski
Journal:  Br J Radiol       Date:  2019-12-16       Impact factor: 3.039

Review 2.  Explainable medical imaging AI needs human-centered design: guidelines and evidence from a systematic review.

Authors:  Haomin Chen; Catalina Gomez; Chien-Ming Huang; Mathias Unberath
Journal:  NPJ Digit Med       Date:  2022-10-19

3.  BI-RADS-Net: An Explainable Multitask Learning Approach for Cancer Diagnosis in Breast Ultrasound Images.

Authors:  Boyu Zhang; Aleksandar Vakanski; Min Xian
Journal:  IEEE Int Workshop Mach Learn Signal Process       Date:  2021-11-15

4.  Using Occlusion-Based Saliency Maps to Explain an Artificial Intelligence Tool in Lung Cancer Screening: Agreement Between Radiologists, Labels, and Visual Prompts.

Authors:  Ziba Gandomkar; Pek Lan Khong; Amanda Punch; Sarah Lewis
Journal:  J Digit Imaging       Date:  2022-04-28       Impact factor: 4.903

Review 5.  Radiomics in Lung Cancer from Basic to Advanced: Current Status and Future Directions.

Authors:  Geewon Lee; Hyunjin Park; So Hyeon Bak; Ho Yun Lee
Journal:  Korean J Radiol       Date:  2020-02       Impact factor: 3.500

6.  Combination of shear wave elastography and BI-RADS in identification of solid breast masses.

Authors:  Xue Zheng; Fei Li; Zhi-Dong Xuan; Yu Wang; Lei Zhang
Journal:  BMC Med Imaging       Date:  2021-12-01       Impact factor: 1.930

Review 7.  Interpretation and visualization techniques for deep learning models in medical imaging.

Authors:  Daniel T Huff; Amy J Weisman; Robert Jeraj
Journal:  Phys Med Biol       Date:  2021-02-02       Impact factor: 3.609
