| Literature DB >> 35124583 |
Chengcheng Liu1, Yi Guo1, Fei Jiang2, Leiming Xu3, Feng Shen3, Zhendong Jin2, Yuanyuan Wang1.
Abstract
BACKGROUND: Automated diagnosis of gastrointestinal stromal tumor (GIST) cancerization is an effective way to improve clinical diagnostic accuracy and reduce the risks of biopsy. Although deep convolutional neural networks (DCNNs) have proven very effective in many image classification problems, there is still a lack of studies on endoscopic ultrasound (EUS) images of GISTs. The task remains a substantial challenge, mainly due to the data distribution bias of multi-center images, the significant inter-class similarity and intra-class variation, and the insufficiency of training data.
Keywords: EUS image; GIST; classification; multi-scale image normalization; transfer learning
Year: 2022 PMID: 35124583 PMCID: PMC9028612 DOI: 10.3233/THC-228005
Source DB: PubMed Journal: Technol Health Care ISSN: 0928-7329 Impact factor: 1.205
Figure 1. Example images of the four GIST risk categories: (a) very low risk, (b) low risk, (c) moderate risk, and (d) high risk. The red line indicates the tumor contour.
Figure 2. The architecture of the proposed framework.
Figure 3. An illustration of severe information loss for small tumors after resizing all images to the same spatial resolution: (a) before resizing and (b) after resizing.
Figure 4. The architecture of the proposed multi-scale image normalization module.
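The record does not describe the multi-scale image normalization module beyond Figures 3 and 4, but the motivation (small tumors losing detail when all images are resized uniformly) suggests a size-aware resizing step. A minimal numpy sketch of one plausible interpretation, with illustrative size thresholds and scale factors that are assumptions rather than the paper's values:

```python
import numpy as np

def nn_resize(img, out_h, out_w):
    """Nearest-neighbour resize using only numpy fancy indexing."""
    h, w = img.shape[:2]
    rows = np.arange(out_h) * h // out_h
    cols = np.arange(out_w) * w // out_w
    return img[rows][:, cols]

def normalize_roi(roi, tumor_size_mm, target=224):
    """Hypothetical multi-scale normalization: upscale small tumors by a
    size-dependent factor before the final resize to the network input
    size, so their texture is not destroyed by aggressive downsampling.
    The 10/20 mm thresholds and 4x/2x factors are illustrative only."""
    if tumor_size_mm < 10:
        roi = nn_resize(roi, roi.shape[0] * 4, roi.shape[1] * 4)
    elif tumor_size_mm < 20:
        roi = nn_resize(roi, roi.shape[0] * 2, roi.shape[1] * 2)
    return nn_resize(roi, target, target)
```

The point of the bucketed pre-scaling is that a 224 × 224 final resize then discards far less of a small tumor's texture than a single direct resize would.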
Detailed information of the feature vector sizes
| Feature name | CNN feature | Tumor size feature | Demographic feature |
|---|---|---|---|
| Feature size | 2048 | 1 | 2 |
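The table above implies the classifier input is a concatenation of a 2048-d CNN feature, a 1-d tumor-size feature, and a 2-d demographic feature (2051 dimensions in total). A minimal sketch of that concatenation, assuming the demographic feature encodes age and a gender flag (the exact encoding is not given in this record):

```python
import numpy as np

def build_feature_vector(cnn_feat, tumor_size, age, gender):
    """Concatenate the three feature groups listed in the table:
    2048-d CNN feature + 1-d tumor-size feature + 2-d demographic
    feature -> a 2051-d input vector for the classifier."""
    assert cnn_feat.shape == (2048,)
    size = np.array([tumor_size], dtype=float)   # 1-d tumor-size feature
    demo = np.array([age, gender], dtype=float)  # 2-d demographic feature (assumed encoding)
    return np.concatenate([cnn_feat, size, demo])
```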
The clinical and demographic information of patients in the dataset (values are reported as mean ± standard deviation)

| Risk category | Number of patients | Number of images | Age (years) | Gender (male/female) |
|---|---|---|---|---|
| Very low | 420 | 870 | 57.27 | 147/273 |
| Low | 326 | 643 | 58.33 | 145/181 |
| Moderate | 114 | 219 | 59.19 | 52/62 |
| High | 54 | 92 | 56.48 | 33/21 |
The details of dataset distribution

| Split | Count | LRG: Very low | LRG: Low | HRG: Moderate | HRG: High |
|---|---|---|---|---|---|
| Training | Number of patients | 319 | 248 | 74 | 38 |
| Training | Number of images | 675 | 495 | 152 | 68 |
| Validation | Number of patients | 17 | 13 | 18 | 6 |
| Validation | Number of images | 17 | 13 | 18 | 6 |
| Testing | Number of patients | 84 | 65 | 22 | 10 |
| Testing | Number of images | 178 | 135 | 49 | 18 |
Comparison results of different normalization methods

| Method | ACC | AUC | SENS | SPEC |
|---|---|---|---|---|
| Without normalization | 0.774 | 0.800 | 0.563 | 0.819 |
| Normalization using original masks | 0.768 | 0.829 | 0.750 | 0.772 |
| Normalization using dilated masks (the proposed method) | | | | |
Figure 5. Comparison of different ROI extraction methods: (a) original image, (b) cropping a patch using the original mask, (c) cropping a patch using the original mask and multiplying by it, and (d) cropping a patch using the dilated mask and multiplying by it (the proposed method).
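Method (d) in Figure 5, cropping with a dilated mask and multiplying by it, can be sketched in plain numpy. The structuring element (a 3 × 3 cross) and the number of dilation iterations are assumptions; the record does not state how much the masks are dilated:

```python
import numpy as np

def dilate(mask, iterations=5):
    """Binary dilation with a 3x3 cross structuring element, numpy only."""
    m = mask.astype(bool)
    for _ in range(iterations):
        p = np.pad(m, 1)  # pad with False so edges dilate correctly
        m = (p[:-2, 1:-1] | p[2:, 1:-1] | p[1:-1, :-2]
             | p[1:-1, 2:] | p[1:-1, 1:-1])
    return m

def extract_roi(image, mask, iterations=5):
    """Figure 5(d): crop the bounding box of the *dilated* tumor mask and
    multiply the patch by the dilated mask, so a margin of context around
    the tumor boundary is retained while the background is zeroed out."""
    d = dilate(mask, iterations)
    ys, xs = np.nonzero(d)
    y0, y1 = ys.min(), ys.max() + 1
    x0, x1 = xs.min(), xs.max() + 1
    return image[y0:y1, x0:x1] * d[y0:y1, x0:x1]
```

Compared with method (b)/(c), the dilation keeps some peritumoral tissue, which is plausibly why the normalization table lists the dilated-mask variant as the proposed method.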
Comparison results of methods using different features as the classifier's inputs

| Method | ACC | AUC | SENS | SPEC |
|---|---|---|---|---|
| CNN features only | 0.768 | 0.775 | 0.719 | 0.779 |
| CNN features + tumor-size feature | 0.779 | 0.838 | 0.781 | 0.779 |
| CNN features + tumor-size + demographic features (the proposed method) | | | | |
Figure 6. ROC curves of methods using different features as the classifier's inputs.
Comparison results of different transfer learning strategies

| Method | Backbone model | ACC | AUC | SENS | SPEC |
|---|---|---|---|---|---|
| Train from scratch | ResNet-50 | 0.746 | 0.835 | 0.719 | 0.752 |
| Transfer learning | VGGNet-16 | 0.768 | 0.840 | 0.781 | 0.765 |
| Transfer learning | MobileNetV2 | 0.757 | 0.835 | 0.750 | 0.758 |
| Transfer learning | DenseNet-121 | 0.774 | 0.845 | 0.750 | 0.779 |
| Transfer learning | AlexNet | 0.762 | 0.828 | 0.750 | 0.765 |
| Transfer learning | Inception-v3 | 0.780 | 0.836 | 0.750 | 0.785 |
| Transfer learning | ResNet-18 | 0.774 | 0.854 | 0.813 | 0.765 |
| Transfer learning | ResNet-50 (the proposed method) | | | | |
The mean and standard deviation of the experimental results

| Statistic | ACC | AUC | SENS | SPEC |
|---|---|---|---|---|
| Mean | 0.809 | 0.851 | 0.806 | 0.809 |
| Standard deviation | 0.006 | 0.010 | 0.026 | 0.010 |
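The final table summarizes repeated runs by mean and standard deviation (the number of repetitions is not stated in this record). A minimal sketch of that summary, assuming the population standard deviation is used:

```python
import numpy as np

def summarize_runs(metric_values):
    """Mean and population standard deviation of a metric over repeated
    runs, rounded to three decimals as in the results table."""
    a = np.asarray(metric_values, dtype=float)
    return round(a.mean(), 3), round(a.std(), 3)
```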