Mohammed A Al-Masni¹ (m.almasani@khu.ac.kr), Mugahed A Al-Antari¹ (en.mualshz@khu.ac.kr), Jeong-Min Park¹ (jmpark@khu.ac.kr), Geon Gi¹ (geon@khu.ac.kr), Tae-Yeon Kim¹ (kty@khu.ac.kr), Patricio Rivera¹ (patoalejor@khu.ac.kr), Edwin Valarezo¹ (edgivala@khu.ac.kr), Mun-Taek Choi² (mtchoi@skku.edu), Seung-Moo Han¹ (smhan@khu.ac.kr), Tae-Seong Kim¹ (tskim@khu.ac.kr)
1. Department of Biomedical Engineering, College of Electronics and Information, Kyung Hee University, Yongin, Republic of Korea.
2. School of Mechanical Engineering, Sungkyunkwan University, Republic of Korea.
Abstract
BACKGROUND AND OBJECTIVE: Automatic detection and classification of masses in mammograms remain challenging and play a crucial role in assisting radiologists toward accurate diagnosis. In this paper, we propose a novel Computer-Aided Diagnosis (CAD) system based on a regional deep learning technique: an ROI-based Convolutional Neural Network (CNN) called You Only Look Once (YOLO). Whereas most previous studies address only the classification of masses, the proposed YOLO-based CAD system handles detection and classification simultaneously in a single framework. METHODS: The proposed CAD system comprises four main stages: preprocessing of mammograms, feature extraction using deep convolutional networks, mass detection with confidence scores, and mass classification using Fully Connected Neural Networks (FC-NNs). We used 600 original mammograms from the Digital Database for Screening Mammography (DDSM) and 2,400 augmented mammograms, together with the locations and types of their masses, to train and test the CAD system. The trained YOLO-based system detects masses and then classifies each as benign or malignant. RESULTS: Five-fold cross-validation shows that the proposed CAD system detects mass locations with an overall accuracy of 99.7% and distinguishes benign from malignant lesions with an overall accuracy of 97%. CONCLUSIONS: The proposed system works even on challenging cases in which masses overlap the pectoral muscles or lie in dense regions.
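The detection-with-confidence stage described above follows the standard YOLO idea: the network divides the image into a grid, and each cell predicts a bounding box, a confidence score, and class probabilities. The following is a minimal sketch of how such a grid output could be decoded into detections; it is not the authors' implementation, and the 7-value cell layout (x, y, w, h, confidence, two class probabilities), grid size, and threshold are illustrative assumptions.

```python
import numpy as np

def decode_yolo_grid(pred, conf_thresh=0.5, S=7):
    """Decode a YOLO-style S x S grid of predictions.

    Each cell holds [x, y, w, h, conf, p_benign, p_malignant], where
    (x, y) is the box centre offset within the cell and (w, h) are
    box dimensions relative to the whole image. Cells whose confidence
    falls below conf_thresh are discarded.
    """
    detections = []
    for row in range(S):
        for col in range(S):
            x, y, w, h, conf, p_benign, p_malignant = pred[row, col]
            if conf < conf_thresh:
                continue
            # convert cell-relative centre to image-relative coordinates
            cx = (col + x) / S
            cy = (row + y) / S
            label = "benign" if p_benign >= p_malignant else "malignant"
            detections.append((cx, cy, w, h, conf, label))
    return detections

# synthetic output: a single cell at grid position (3, 2) fires
pred = np.zeros((7, 7, 7))
pred[3, 2] = [0.5, 0.5, 0.2, 0.3, 0.9, 0.1, 0.8]
print(decode_yolo_grid(pred))  # one box, labelled "malignant"
```

In a single-stage detector like this, localization and classification come out of the same forward pass, which is what lets the abstract's system perform both tasks in one framework.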