Chunxiao Li1, Yuanfan Guo2, Liqiong Jia3, Minghua Yao1, Sihui Shao1, Jing Chen1, Yi Xu2, Rong Wu1. 1. Department of Ultrasound, Shanghai General Hospital, Shanghai Jiao Tong University School of Medicine, Shanghai, China. 2. Shanghai Key Lab of Digital Media Processing and Transmission, Shanghai Jiao Tong University, Shanghai, China. 3. Department of Ultrasound, Zhongshan Hospital Wusong Branch, Fudan University, Shanghai, China.
Abstract
Purpose: A convolutional neural network (CNN) can perform well in either of two independent tasks [classification and axillary lymph-node metastasis (ALNM) prediction] based on breast ultrasound (US) images. This study aimed to investigate the feasibility of performing the two tasks simultaneously. Methods: We developed a multi-task CNN model based on a self-built dataset containing 5911 breast US images from 2131 patients. A hierarchical loss (HL) function was designed to relate the two tasks. Sensitivity, specificity, accuracy, precision, and F1-score were calculated, and receiver operating characteristic (ROC) curves and heatmaps were analyzed. A radiomics model was built with the PyRadiomics package for comparison. Results: The sensitivity, specificity, and area under the ROC curve (AUC) of our CNN model were 83.5%, 71.6%, and 0.878 for the classification task and 76.9%, 78.3%, and 0.836 for the ALNM task, respectively. With HL correction, the inconsistency error of ALNM prediction decreased from 7.5% to 4.2%. For predicting high ALNM burden (≥3 or ≥4 positive nodes), the CNN model achieved a sensitivity, specificity, and AUC of 77.3%, 62.7%, and 0.752, and 66.6%, 76.8%, and 0.768, respectively. Conclusion: The proposed multi-task CNN model can simultaneously distinguish breast lesions and indicate nodal burden on US, which is valuable for "personalized" treatment.
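The abstract does not specify the form of the hierarchical loss (HL) that relates the two tasks. One plausible formulation, sketched here purely for illustration, combines the per-task binary cross-entropies with a consistency penalty that discourages the clinically inconsistent state "benign lesion with predicted nodal metastasis" (since ALNM implies malignancy). The function names, the weighting parameter `lam`, and the ReLU-style penalty are all assumptions, not the authors' actual loss.

```python
import numpy as np

def bce(p, y, eps=1e-7):
    """Binary cross-entropy for a single predicted probability p and label y."""
    p = np.clip(p, eps, 1 - eps)
    return -(y * np.log(p) + (1 - y) * np.log(1 - p))

def hierarchical_loss(p_malig, p_alnm, y_malig, y_alnm, lam=1.0):
    """Joint loss for the two heads plus a hierarchy-consistency term.

    p_malig : predicted probability the lesion is malignant
    p_alnm  : predicted probability of axillary lymph-node metastasis
    The consistency term is active only when the ALNM head is more
    confident than the malignancy head, i.e. when the model predicts
    metastasis for a lesion it considers likely benign.
    """
    task_loss = bce(p_malig, y_malig) + bce(p_alnm, y_alnm)
    consistency = max(0.0, p_alnm - p_malig)  # hinge-style penalty
    return task_loss + lam * consistency
```

Under this sketch, raising `lam` trades a small increase in per-task loss for fewer inconsistent predictions, which is the qualitative effect the abstract reports (inconsistency error falling from 7.5% to 4.2%).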