
CTG-Net: Cross-task guided network for breast ultrasound diagnosis.

Kaiwen Yang1,2, Aiga Suzuki2, Jiaxing Ye2, Hirokazu Nosato2, Ayumi Izumori3, Hidenori Sakanashi1,2.   

Abstract

Deep learning techniques have achieved remarkable success in lesion segmentation and in classifying benign versus malignant tumors in breast ultrasound images. However, existing studies predominantly focus on devising efficient neural network architectures to tackle each task individually. In clinical practice, by contrast, sonographers perform segmentation and classification as a whole: they investigate the border contours of the tissue while detecting abnormal masses and performing diagnostic analysis. Performing multiple cognitive tasks simultaneously in this manner exploits the commonalities and differences between tasks. Inspired by this unified recognition process, this study proposes a novel learning scheme, called the cross-task guided network (CTG-Net), for efficient breast ultrasound image understanding. CTG-Net integrates the two most significant tasks in computerized breast lesion pattern investigation: lesion segmentation and tumor classification. Further, it learns efficient cross-task feature representations from ultrasound images together with the task-specific discriminative features that greatly facilitate lesion detection. This is achieved with task-specific attention models that share prediction results between tasks. Guided by the resulting task-specific soft attention masks, the joint feature responses are calibrated through iterative model training. Finally, a simple feature fusion scheme aggregates the attention-guided features for efficient ultrasound pattern analysis. We performed extensive experimental comparisons on multiple ultrasound datasets. Compared with state-of-the-art multi-task learning approaches, the proposed approach improves the Dice coefficient and true-positive rate of segmentation and the AUC and sensitivity of classification by 11%, 17%, 2%, and 6%, respectively. The results demonstrate that the proposed cross-task guided feature learning framework effectively fuses the complementary information of the segmentation and classification tasks to achieve accurate tumor localization, and can thus aid sonographers in detecting and diagnosing breast cancer.
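The attention-guided calibration the abstract describes — one task's prediction acting as a soft mask that re-weights the other task's features, followed by a simple fusion step — can be sketched as below. This is a minimal NumPy illustration under assumed toy shapes, not the authors' CTG-Net implementation; all names and dimensions here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed toy shapes: C feature channels over an H x W spatial map.
C, H, W = 4, 8, 8
seg_logits = rng.normal(size=(H, W))    # segmentation branch output (logits)
cls_feats = rng.normal(size=(C, H, W))  # classification branch feature map

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cross_task_guidance(seg_logits, cls_feats):
    """Turn the segmentation prediction into a soft attention mask that
    calibrates the classification features, then fuse the original and
    attention-guided features by channel-wise concatenation."""
    soft_mask = sigmoid(seg_logits)             # values in (0, 1), shape (H, W)
    guided = cls_feats * soft_mask[None, :, :]  # broadcast mask over channels
    fused = np.concatenate([cls_feats, guided], axis=0)  # simple fusion
    return soft_mask, fused

mask, fused = cross_task_guidance(seg_logits, cls_feats)
print(fused.shape)  # (8, 8, 8): original C channels plus C guided channels
```

In CTG-Net the guidance is applied in both directions and refined over iterative training; the one-way, single-pass version above only illustrates the masking-and-fusion idea.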


Year: 2022        PMID: 35951606        PMCID: PMC9371312        DOI: 10.1371/journal.pone.0271106

Source DB: PubMed        Journal: PLoS One        ISSN: 1932-6203        Impact factor: 3.752


