Endre Grøvik1,2, Darvin Yi3, Michael Iv1, Elizabeth Tong1, Daniel Rubin3, Greg Zaharchuk1. 1. Department of Radiology, Stanford University, Stanford, California, USA. 2. Department for Diagnostic Physics, Oslo University Hospital, Oslo, Norway. 3. Department of Biomedical Data Science, Stanford University, Stanford, California, USA.
Abstract
BACKGROUND: Detecting and segmenting brain metastases is a tedious and time-consuming task for many radiologists, particularly with the growing use of multisequence 3D imaging. PURPOSE: To demonstrate automated detection and segmentation of brain metastases on multisequence MRI using a deep-learning approach based on a fully convolutional neural network (CNN). STUDY TYPE: Retrospective. POPULATION: In all, 156 patients with brain metastases from several primary cancers were included. FIELD STRENGTH: 1.5T and 3T. [Correction added on May 24, 2019, after first online publication: In the preceding sentence, the first field strength listed was corrected.] SEQUENCE: Pretherapy MR images included pre- and postgadolinium T1-weighted 3D fast spin echo (CUBE), postgadolinium T1-weighted 3D axial IR-prepped FSPGR (BRAVO), and 3D CUBE fluid-attenuated inversion recovery (FLAIR). ASSESSMENT: The ground truth was established by manual delineation by two experienced neuroradiologists. CNN training/development was performed using 100 and 5 patients, respectively, with a 2.5D network based on a GoogLeNet architecture. The results were evaluated in 51 patients, equally separated into those with few (1-3), multiple (4-10), and many (>10) lesions. STATISTICAL TESTS: Network performance was evaluated using precision, recall, Dice/F1 score, and receiver operating characteristic (ROC) curve statistics. For an optimal probability threshold, detection and segmentation performance was assessed on a per-metastasis basis. The Wilcoxon rank sum test was used to test the differences between patient subgroups. RESULTS: The area under the ROC curve (AUC), averaged across all patients, was 0.98 ± 0.04. The AUC in the subgroups was 0.99 ± 0.01, 0.97 ± 0.05, and 0.97 ± 0.03 for patients having 1-3, 4-10, and >10 metastases, respectively.
Using an average optimal probability threshold determined on the development set, precision, recall, and Dice score were 0.79 ± 0.20, 0.53 ± 0.22, and 0.79 ± 0.12, respectively. At the same probability threshold, the network showed an average false-positive rate of 8.3/patient (no lesion-size limit) and 3.4/patient (10 mm³ lesion-size limit). DATA CONCLUSION: A deep-learning approach using multisequence MRI can automatically detect and segment brain metastases with high accuracy. LEVEL OF EVIDENCE: 3 Technical Efficacy Stage: 2 J. Magn. Reson. Imaging 2020;51:175-182.
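The abstract reports voxel-wise precision, recall, and Dice score at a probability threshold tuned on the development set. As a minimal sketch of how such metrics are computed from a network's per-voxel probability map and a manual delineation (this is an illustration of the standard definitions, not the authors' evaluation code; the function name and toy arrays are hypothetical):

```python
import numpy as np

def segmentation_metrics(prob_map, ground_truth, threshold=0.5):
    """Voxel-wise precision, recall, and Dice score for a binary segmentation.

    prob_map:      float array of per-voxel lesion probabilities (network output)
    ground_truth:  binary array of the same shape (manual delineation)
    threshold:     probability cutoff applied to prob_map before comparison
    """
    pred = np.asarray(prob_map) >= threshold
    gt = np.asarray(ground_truth).astype(bool)
    tp = np.logical_and(pred, gt).sum()       # voxels correctly labeled lesion
    fp = np.logical_and(pred, ~gt).sum()      # false-positive voxels
    fn = np.logical_and(~pred, gt).sum()      # missed lesion voxels
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    dice = 2 * tp / (2 * tp + fp + fn) if tp else 0.0
    return precision, recall, dice

# Toy example: a 4x4 "slice" where the prediction covers part of the true lesion.
prob = np.array([[0.9, 0.8, 0.1, 0.0],
                 [0.7, 0.6, 0.2, 0.0],
                 [0.1, 0.1, 0.0, 0.0],
                 [0.0, 0.0, 0.0, 0.0]])
gt = np.zeros((4, 4), dtype=int)
gt[:2, :3] = 1  # true lesion occupies the top-left 2x3 block (6 voxels)
p, r, d = segmentation_metrics(prob, gt, threshold=0.5)
# p = 1.0 (no false positives), r = 4/6, d = 2*4/(2*4 + 0 + 2) = 0.8
```

Sweeping the threshold over [0, 1] and plotting recall (sensitivity) against the false-positive rate yields the ROC curve whose area (AUC) the study reports per patient.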