Shoji Kido, Shunske Kidera, Yasushi Hirano, Shingo Mabu, Tohru Kamiya, Nobuyuki Tanaka, Yuki Suzuki, Masahiro Yanagawa, Noriyuki Tomiyama.
Abstract
In computer-aided diagnosis systems for lung cancer, segmentation of lung nodules is important for analyzing image features of lung nodules on computed tomography (CT) images and distinguishing malignant nodules from benign ones. However, it is difficult to accurately and robustly segment lung nodules attached to the chest wall or with ground-glass opacities using conventional image processing methods. Therefore, this study aimed to develop a method for robust and accurate three-dimensional (3D) segmentation of lung nodule regions using deep learning. In this study, we proposed a nested 3D fully connected convolutional network with residual unit structures and designed a new loss function. Compared with annotated images obtained under the guidance of a radiologist, the Dice similarity coefficient (DS) and intersection over union (IoU) were 0.845 ± 0.008 and 0.738 ± 0.011, respectively, for 332 lung nodules (lung adenocarcinoma) obtained from 332 patients. In contrast, for 3D U-Net and 3D SegNet, the DS was 0.822 ± 0.009 and 0.786 ± 0.011, respectively, and the IoU was 0.711 ± 0.011 and 0.660 ± 0.012, respectively. These results indicate that the proposed method is significantly superior to well-known deep learning models. Moreover, we compared the results obtained from the proposed method with those obtained from conventional image processing methods, namely the watershed and graph cut methods. The DS and IoU for the watershed method were 0.628 ± 0.027 and 0.494 ± 0.025, respectively, and those for the graph cut method were 0.566 ± 0.025 and 0.414 ± 0.021, respectively. These results indicate that the proposed method is significantly superior to conventional image processing methods. The proposed method may be useful for accurate and robust segmentation of lung nodules to assist radiologists in the diagnosis of lung nodules such as lung adenocarcinoma on CT images.
Keywords: SegNet; U-Net; computer-aided diagnosis; deep learning; graph cut; lung nodule; segmentation; watershed
Year: 2022 PMID: 35252849 PMCID: PMC8892185 DOI: 10.3389/frai.2022.782225
Source DB: PubMed Journal: Front Artif Intell ISSN: 2624-8212
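The DS and IoU values reported throughout this record are the two standard overlap metrics between a predicted segmentation mask and the radiologist-guided annotation. A minimal sketch of how they are computed on binary 3D masks (illustrative only; the toy mask shapes and sizes below are assumptions, not data from the study):

```python
import numpy as np

def dice_coefficient(pred: np.ndarray, truth: np.ndarray) -> float:
    """Dice similarity coefficient (DS): 2|A∩B| / (|A| + |B|)."""
    intersection = np.logical_and(pred, truth).sum()
    return 2.0 * intersection / (pred.sum() + truth.sum())

def iou(pred: np.ndarray, truth: np.ndarray) -> float:
    """Intersection over union (IoU, Jaccard index): |A∩B| / |A∪B|."""
    intersection = np.logical_and(pred, truth).sum()
    union = np.logical_or(pred, truth).sum()
    return intersection / union

# Toy 3D binary masks: a predicted nodule region vs. a reference annotation.
pred = np.zeros((4, 4, 4), dtype=bool)
truth = np.zeros((4, 4, 4), dtype=bool)
pred[1:3, 1:3, 1:3] = True   # 8 voxels predicted
truth[1:3, 1:3, 1:4] = True  # 12 voxels annotated, 8 of them shared

print(dice_coefficient(pred, truth))  # 2*8 / (8 + 12) = 0.8
print(iou(pred, truth))               # 8 / 12 ≈ 0.667
```

Note that DS is always at least as large as IoU for the same pair of masks, which is consistent with the DS/IoU pairs reported in the tables below.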
Figure 1Architecture of the proposed nested three-dimensional (3D) fully connected convolutional network. The connections are indicated by the red circles, where the encoder and decoder are connected by concatenation.
Figure 2Architecture of the residual unit. (A) Conventional feed-forward neural network and (B) residual unit.
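Figure 2 contrasts a conventional feed-forward block (A) with a residual unit (B), where the block's input is added back to its output through a shortcut connection so the layers learn a correction F(x) rather than a full mapping. A minimal dense-layer sketch of the two variants (illustrative only; the paper's actual residual units operate on 3D convolutional feature maps, and the weight shapes here are assumptions):

```python
import numpy as np

def relu(x: np.ndarray) -> np.ndarray:
    return np.maximum(x, 0.0)

def feed_forward(x: np.ndarray, w1: np.ndarray, w2: np.ndarray) -> np.ndarray:
    """(A) Conventional block: the output replaces the input entirely."""
    return relu(w2 @ relu(w1 @ x))

def residual_unit(x: np.ndarray, w1: np.ndarray, w2: np.ndarray) -> np.ndarray:
    """(B) Residual unit: the block learns a correction F(x) added to x."""
    return relu(x + w2 @ relu(w1 @ x))

# With all-zero weights the residual unit still passes the signal through
# (identity shortcut), while the plain block outputs zeros.
x = np.array([1.0, -2.0, 3.0])
w_zero = np.zeros((3, 3))
print(residual_unit(x, w_zero, w_zero))  # [1. 0. 3.] — relu(x) survives
print(feed_forward(x, w_zero, w_zero))   # [0. 0. 0.] — signal is lost
```

The identity shortcut is what makes very deep encoder–decoder stacks trainable: gradients can flow through the addition unchanged even when the learned branch contributes little.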
Comparison of the proposed method with four segmentation methods.

| Method | DS | IoU |
|---|---|---|
| Proposed | 0.845 ± 0.008 | 0.738 ± 0.011 |
| 3D U-Net | 0.822 ± 0.009 | 0.711 ± 0.011 |
| 3D SegNet | 0.786 ± 0.011 | 0.660 ± 0.012 |
| Watershed | 0.628 ± 0.027 | 0.494 ± 0.025 |
| Graph cut | 0.566 ± 0.025 | 0.414 ± 0.021 |

Comparison with the proposed method: P < 0.01; P < 0.001.
Average processing time per case for the proposed model, 3D U-Net, and 3D SegNet.

| Method | Processing time |
|---|---|
| Proposed | 0.283 ± 0.002 |
| 3D U-Net | 0.160 ± 0.001 |
| 3D SegNet | 0.083 ± 0.001 |

Comparison with the proposed method: P < 0.0001.
Figure 3Examples of segmentation results in the case of a GGO nodule.
Comparison of the proposed method with four segmentation methods in the case of a GGO nodule.

| Method | DS | IoU |
|---|---|---|
| Proposed | 0.886 | 0.795 |
| 3D U-Net | 0.849 | 0.738 |
| 3D SegNet | 0.835 | 0.719 |
| Watershed | 0.310 | 0.184 |
| Graph cut | 0.573 | 0.401 |
Figure 4Examples of segmentation results in the case of a nodule attached to the chest wall.
Comparison of the proposed method with four segmentation methods in the case of a nodule attached to the chest wall.

| Method | DS | IoU |
|---|---|---|
| Proposed | 0.772 | 0.628 |
| 3D U-Net | 0.559 | 0.388 |
| 3D SegNet | 0.528 | 0.359 |
| Watershed | 0.000 | 0.000 |
| Graph cut | 0.504 | 0.336 |
Figure 5An example of extraction results when the value of λ was changed. Under-extraction was observed when only Dice loss was used as the loss function (λ = 0.0).
Figure 6An example of extraction results when the value of λ was changed. Over-extraction was observed when only binary cross entropy was used as the loss function (λ = 1.0).
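Figures 5 and 6 indicate that the new loss function blends Dice loss and binary cross entropy (BCE) through a weight λ: pure Dice loss (λ = 0.0) led to under-extraction and pure BCE (λ = 1.0) to over-extraction. A minimal sketch of such a λ-weighted blend (the exact functional form and smoothing constants used in the paper are assumptions here):

```python
import numpy as np

def bce_loss(pred: np.ndarray, truth: np.ndarray, eps: float = 1e-7) -> float:
    """Binary cross entropy averaged over voxels; pred holds probabilities."""
    p = np.clip(pred, eps, 1.0 - eps)
    return float(-np.mean(truth * np.log(p) + (1 - truth) * np.log(1 - p)))

def dice_loss(pred: np.ndarray, truth: np.ndarray, eps: float = 1e-7) -> float:
    """Soft Dice loss: 1 - DS computed on predicted probabilities."""
    intersection = (pred * truth).sum()
    return float(1.0 - (2.0 * intersection + eps) / (pred.sum() + truth.sum() + eps))

def combined_loss(pred: np.ndarray, truth: np.ndarray, lam: float) -> float:
    """λ-weighted blend: λ = 0.0 gives pure Dice loss, λ = 1.0 pure BCE."""
    return lam * bce_loss(pred, truth) + (1.0 - lam) * dice_loss(pred, truth)
```

An intermediate λ trades off the two failure modes illustrated in the figures: the Dice term rewards overlap with the annotated region, while the BCE term penalizes every confidently wrong voxel, including over-extracted background.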