Atsushi Teramoto, Tomoyuki Shibata, Hyuga Yamada, Yoshiki Hirooka, Kuniaki Saito, Hiroshi Fujita.
Abstract
Endoscopy is widely used in the examination of gastric cancer. However, extensive knowledge and experience are required, owing to the need to examine the lesion while manipulating the endoscope, and various diagnostic support techniques have been reported for this examination. In our previous study, segmentation of invasive areas of gastric cancer was performed directly on endoscopic images, and the detection sensitivity per case was 0.98. That method suffered from false positives and high computational cost, because segmentation was applied to every image captured during the examination, including images of healthy tissue. In this study, we propose a cascaded deep learning model that performs categorization of endoscopic images followed by identification of the invasive region to address these challenges. Endoscopic images are first classified as normal, showing early gastric cancer, or showing advanced gastric cancer using a convolutional neural network. Segmentation of the extent of gastric cancer invasion is then performed, only for the images classified as showing cancer, using two separate U-Net models. In an experiment, 1208 endoscopic images collected from healthy subjects, 533 images from patients with early-stage gastric cancer, and 637 images from patients with advanced gastric cancer were used for evaluation. The sensitivity and specificity of the proposed approach in the detection of gastric cancer via image classification were 97.0% and 99.4%, respectively. Furthermore, both detection sensitivity and specificity reached 100% in a case-based evaluation. The extent of invasion was also identified at an acceptable level, suggesting that the proposed method may be useful for the classification of endoscopic images and identification of the extent of cancer invasion.
Keywords: classification; convolutional neural network; deep learning; gastric cancer; segmentation
Year: 2022 PMID: 36010346 PMCID: PMC9406996 DOI: 10.3390/diagnostics12081996
Source DB: PubMed Journal: Diagnostics (Basel) ISSN: 2075-4418
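The two-stage pipeline described in the abstract (CNN triage of each frame, then stage-specific U-Net segmentation only for frames classified as cancerous) can be sketched as follows. This is a minimal control-flow sketch: the `classify`, `segment_early`, and `segment_advanced` functions are hypothetical placeholders standing in for the trained DenseNet121 classifier and the two U-Net models, not the paper's implementation.

```python
import numpy as np

# Hypothetical stand-ins for the trained networks: a 3-class CNN classifier
# and two stage-specific U-Net segmenters. Real models would be deep
# networks; these placeholders only make the cascade's control flow runnable.
CLASSES = ("healthy", "early", "advanced")

def classify(image):
    # placeholder: pretend the mean intensity encodes the class
    return CLASSES[int(image.mean() * 3) % 3]

def segment_early(image):
    return np.zeros(image.shape[:2], dtype=np.uint8)   # placeholder mask

def segment_advanced(image):
    return np.ones(image.shape[:2], dtype=np.uint8)    # placeholder mask

def cascade(image):
    """Stage 1: classify the frame; Stage 2: segment only if cancer is predicted."""
    label = classify(image)
    if label == "healthy":
        return label, None          # healthy frames skip segmentation entirely
    mask = segment_early(image) if label == "early" else segment_advanced(image)
    return label, mask
```

Routing healthy frames past the segmentation stage is what reduces the false positives and computational cost of the earlier segmentation-only approach: the U-Nets never see the (much more numerous) healthy images.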
Figure 1. Outline of the proposed method.
Figure 2. Image cropping for regularization.
Figure 3. Sample images of the image dataset: (a) healthy subjects; (b) early gastric cancer; (c) advanced gastric cancer.
Figure 4. U-Net architecture for gastric cancer segmentation.
Comparison of CNN models for image classification. Values in bold indicate the results of the CNN model with the highest performance.
(a) Image-based evaluation

| Model | Healthy | Early Gastric Cancer | Advanced Gastric Cancer | Accuracy | Mean Recall | Sensitivity | Specificity |
|---|---|---|---|---|---|---|---|
| VGG16 | 0.964 | 0.919 | 0.922 | 0.943 | 0.935 | 0.951 | 0.964 |
| VGG19 | 0.994 | 0.947 | 0.904 | 0.960 | 0.949 | 0.939 | 0.994 |
| InceptionV3 | 0.988 | 0.921 | 0.906 | 0.951 | 0.938 | 0.938 | 0.988 |
| **DenseNet121** | **0.994** | **0.998** | **0.945** | **0.982** | **0.979** | **0.970** | **0.994** |
| DenseNet169 | 0.995 | 0.998 | 0.903 | 0.971 | 0.965 | 0.948 | 0.995 |
| DenseNet201 | 0.996 | 0.989 | 0.943 | 0.980 | 0.976 | 0.965 | 0.996 |
| ResNet50 | 0.991 | 0.991 | 0.922 | 0.972 | 0.968 | 0.956 | 0.991 |
| ResNet101 | 0.990 | 0.981 | 0.920 | 0.969 | 0.964 | 0.949 | 0.990 |
| ResNet152 | 0.996 | 0.994 | 0.918 | 0.975 | 0.970 | 0.954 | 0.996 |

(b) Case-based evaluation

| Model | Healthy | Early Gastric Cancer | Advanced Gastric Cancer | Accuracy | Mean Recall | Sensitivity | Specificity |
|---|---|---|---|---|---|---|---|
| VGG16 | 0.976 | 0.937 | 0.960 | 0.952 | 0.958 | 0.979 | 0.976 |
| VGG19 | 1.000 | 0.947 | 0.940 | 0.957 | 0.962 | 0.972 | 1.000 |
| InceptionV3 | 1.000 | 0.958 | 0.980 | 0.973 | 0.979 | 0.986 | 1.000 |
| **DenseNet121** | **1.000** | **1.000** | **1.000** | **1.000** | **1.000** | **1.000** | **1.000** |
| DenseNet169 | 1.000 | 1.000 | 0.960 | 0.989 | 0.987 | 0.986 | 1.000 |
| DenseNet201 | 1.000 | 1.000 | 0.980 | 0.995 | 0.993 | 0.993 | 1.000 |
| ResNet50 | 1.000 | 0.989 | 1.000 | 0.995 | 0.996 | 1.000 | 1.000 |
| ResNet101 | 1.000 | 1.000 | 0.980 | 0.995 | 0.993 | 0.993 | 1.000 |
| ResNet152 | 1.000 | 1.000 | 0.980 | 0.995 | 0.993 | 0.993 | 1.000 |
Confusion matrices of classification using DenseNet121.
(a) Image-based evaluation

| | | Predicted: Healthy | Predicted: Early Gastric Cancer | Predicted: Advanced Gastric Cancer |
|---|---|---|---|---|
| Actual | Healthy | 1201 | 0 | 7 |
| | Early gastric cancer | 0 | 531 | 1 |
| | Advanced gastric cancer | 35 | 0 | 602 |

(b) Case-based evaluation

| | | Predicted: Healthy | Predicted: Early Gastric Cancer | Predicted: Advanced Gastric Cancer |
|---|---|---|---|---|
| Actual | Healthy | 42 | 0 | 0 |
| | Early gastric cancer | 0 | 95 | 0 |
| | Advanced gastric cancer | 0 | 0 | 50 |
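The abstract's 97.0% sensitivity and 99.4% specificity follow from the image-based DenseNet121 confusion matrix by collapsing the two cancer classes into a single "cancer detected" outcome; a minimal sketch:

```python
import numpy as np

# Image-based DenseNet121 confusion matrix, taken from the table above
# (rows = actual class, columns = predicted class; order: healthy, early, advanced).
cm = np.array([
    [1201,   0,   7],
    [   0, 531,   1],
    [  35,   0, 602],
])

def detection_metrics(cm):
    """Collapse the 3-class matrix into cancer-vs-healthy detection metrics."""
    healthy_as_healthy = cm[0, 0]          # healthy images kept out of the cascade
    healthy_total = cm[0].sum()
    cancer_as_cancer = cm[1:, 1:].sum()    # either cancer class counts as detected
    cancer_total = cm[1:].sum()
    sensitivity = cancer_as_cancer / cancer_total
    specificity = healthy_as_healthy / healthy_total
    return sensitivity, specificity

sens, spec = detection_metrics(cm)
print(f"sensitivity={sens:.3f}, specificity={spec:.3f}")  # sensitivity=0.970, specificity=0.994
```

Note that an advanced-cancer image predicted as early (or vice versa) still counts as a detection here, which is appropriate since either label routes the image to a segmentation U-Net.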
Figure 5. Correctly and incorrectly classified images: (a) classification results of early gastric cancer; (b) classification results of advanced gastric cancer; (c) classification results of healthy subjects.
Figure 6. Lesions detected and missed in the segmentation phase: (a,b) correctly detected region of advanced gastric cancer; (c) missed region of advanced gastric cancer; (d,e) correctly detected region of early gastric cancer; (f) missed region of early gastric cancer.
Evaluation results of cancer segmentation.
| | Dice Index | Jaccard Index |
|---|---|---|
| Early gastric cancer | 0.555 | 0.427 |
| Advanced gastric cancer | 0.716 | 0.611 |
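Assuming the two columns are the Dice and Jaccard indices, both are standard overlap measures computed per image between the predicted and ground-truth binary masks; a minimal sketch:

```python
import numpy as np

def dice_index(pred, gt):
    """Dice = 2|A ∩ B| / (|A| + |B|) for binary masks A (prediction) and B (ground truth)."""
    inter = np.logical_and(pred, gt).sum()
    return 2.0 * inter / (pred.sum() + gt.sum())

def jaccard_index(pred, gt):
    """Jaccard = |A ∩ B| / |A ∪ B|; per mask pair it relates to Dice by J = D / (2 - D)."""
    inter = np.logical_and(pred, gt).sum()
    union = np.logical_or(pred, gt).sum()
    return inter / union

# Toy 2x3 masks: 2 overlapping pixels, 3 predicted, 3 ground-truth, 4 in the union.
pred = np.array([[1, 1, 0], [0, 1, 0]])
gt   = np.array([[1, 0, 0], [0, 1, 1]])
print(dice_index(pred, gt), jaccard_index(pred, gt))  # Dice = 2/3, Jaccard = 1/2
```

The J = D / (2 − D) identity holds per image but not for the table's values, which are averages over many images.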
Performance comparison of gastric cancer detection and segmentation.
| Author | Method | Image Dataset | Detection Performance | Segmentation Performance |
|---|---|---|---|---|
| Hirasawa et al. [ | SSD | Original | Sensitivity = 0.922 | - |
| Sakai et al. [ | CNN | Original | Sensitivity = 0.800 | - |
| Shibata et al. [ | Mask R-CNN | Original | Sensitivity = 0.96 | Dice index = 0.54 |
| Teramoto et al. [ | U-Net + CNN | Original | Sensitivity = 0.98 | Dice index = 0.56 |
| Proposed method | Cascade CNN | Original | Sensitivity = 1.00 | Dice index = 0.56 |