| Literature DB >> 35002667 |
Yuteng Xiao, Hongsheng Yin, Shui-Hua Wang, Yu-Dong Zhang.
Abstract
Early diagnosis of pathological brains enables early intervention in brain diseases, which may help control the illness, prolong patients' lives, and even cure them. The classification of brain diseases is therefore a challenging but valuable task. However, brain images are hard to collect, and when images are abundant they place a heavy demand on computing resources. This study proposes a new approach named TReC: Transferred Residual Network (ResNet) with a Convolutional Block Attention Module (CBAM), a model designed for small-scale samples, to detect brain diseases from MRI. First, a ResNet model pre-trained on the ImageNet dataset serves as the initialization. Next, a lightweight attention mechanism, CBAM, is inserted into every residual block of the ResNet. At the same time, the fully connected (FC) layers of the ResNet are replaced with new FC layers matched to the classification task. Finally, all parameters of the model, including the ResNet, the CBAM modules, and the new FC layers, are retrained. The effectiveness of the proposed model is evaluated on brain magnetic resonance (MR) datasets for two-class and multi-class tasks. Compared with other state-of-the-art models, our model reaches the best performance on both the two-class and the multi-class brain-disease tasks.
Keywords: attention mechanism; magnetic resonance imaging; multi-class classification; pathological brain; transfer learning
Year: 2021 PMID: 35002667 PMCID: PMC8733727 DOI: 10.3389/fninf.2021.781551
Source DB: PubMed Journal: Front Neuroinform ISSN: 1662-5196 Impact factor: 4.081
Figure 1 Examples of different types of pathological brain. (a) normal; (b) cerebrovascular disease; (c) neoplastic disease; (d) degenerative disease; (e) infectious disease.
Figure 2 The structure of transfer learning with the ResNet-CBAM model.
Figure 3 Schematic diagram of the convolutional operation.
Figure 4 Structure of a residual block.
Figure 5 Structure of the ResNet-CBAM model.
Pseudocode of transferred ResNet-CBAM (TReC) algorithm.
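The pipeline described above, a pre-trained ResNet backbone with a CBAM gate in every residual block and a replaced FC head, hinges on the CBAM gate itself: channel attention from pooled descriptors fed through a shared MLP, followed by spatial attention over channel-pooled maps. The following NumPy sketch is for illustration only: the weights are random, and the spatial branch uses a simple 1×1 mixing of the two pooled maps where the original CBAM uses a 7×7 convolution.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def cbam(x, w1, w2, w_sp):
    """Apply a CBAM-style attention gate to a feature map x of shape (C, H, W).

    Channel attention: a shared two-layer MLP (w1, w2) over the global
    average- and max-pooled channel descriptors. Spatial attention: here a
    1x1 mixing of the channel-wise average and max maps (a simplification;
    CBAM proper uses a 7x7 convolution over the stacked maps).
    """
    # --- channel attention ---
    avg = x.mean(axis=(1, 2))                 # (C,) average-pooled descriptor
    mx = x.max(axis=(1, 2))                   # (C,) max-pooled descriptor
    mlp = lambda v: w2 @ np.maximum(w1 @ v, 0.0)
    m_c = sigmoid(mlp(avg) + mlp(mx))         # (C,) channel gate in (0, 1)
    x = x * m_c[:, None, None]
    # --- spatial attention ---
    avg_map = x.mean(axis=0)                  # (H, W) pooled across channels
    max_map = x.max(axis=0)                   # (H, W)
    m_s = sigmoid(w_sp[0] * avg_map + w_sp[1] * max_map)  # (H, W) spatial gate
    return x * m_s[None, :, :]

C, H, W, r = 8, 4, 4, 2                       # toy sizes; r is the reduction ratio
x = rng.standard_normal((C, H, W))
w1 = rng.standard_normal((C // r, C)) * 0.1   # reduction layer of the shared MLP
w2 = rng.standard_normal((C, C // r)) * 0.1   # expansion layer of the shared MLP
w_sp = rng.standard_normal(2) * 0.1           # 1x1 spatial mixing weights
y = cbam(x, w1, w2, w_sp)
print(y.shape)  # (8, 4, 4)
```

Because both gates lie in (0, 1), the output has the same shape as the input and is element-wise damped, which is what lets the module drop into a residual block without changing tensor shapes.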
Configuration of 5-fold cross-validation.

| Task | Training images (per fold) | Test images (per fold) |
|---|---|---|
| Two-class | 270 | 67 |
| Five-class | 158 | 39 |
Figure 6 The schematic diagram of 5-fold cross-validation.
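The per-fold sizes in the configuration table (e.g., 270 training and 67 test images for the two-class task, 337 images in total) follow from cutting the shuffled data into five nearly equal parts and holding out each part once. A minimal sketch of such a splitter, assuming a hypothetical total of 337 samples (fold sizes may differ by one when the total is not divisible by five):

```python
import numpy as np

def five_fold_splits(n_samples, n_folds=5, seed=0):
    """Yield (train_idx, test_idx) pairs for k-fold cross-validation.

    A hand-rolled version of the usual k-fold scheme: shuffle once, cut the
    indices into n_folds nearly equal parts, and use each part as the test
    set exactly once while training on the rest.
    """
    rng = np.random.default_rng(seed)
    idx = rng.permutation(n_samples)
    folds = np.array_split(idx, n_folds)
    for k in range(n_folds):
        test = folds[k]
        train = np.concatenate([folds[j] for j in range(n_folds) if j != k])
        yield train, test

# 337 two-class images, as in the configuration table above
sizes = [(len(tr), len(te)) for tr, te in five_fold_splits(337)]
print(sizes)
```

Every sample appears in exactly one test fold, so the test-fold sizes sum back to the total sample count.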
The statistics for two-class classification (classes: pathological brain, normal brain).
Comparison with state-of-the-art methods for the two-class task.

| Study | Method | Sensitivity | Specificity | Precision | F1-score | Accuracy |
|---|---|---|---|---|---|---|
| Lu et al. | ResNet-ELM-CBA | 95.71% | 94.29% | – | – | 95.00% |
| Lu et al. | MobileNet-RVFL-CBA | 98.89% | 91.67% | – | – | 96.00% |
| Lu et al. | BN-AlexNet-ELM-CBA | 97.14% | 95.71% | 96.17% | 96.50% | 96.43% |
| Talo et al. | Deep transfer ResNet | – | – | – | – | 100.00% |
| Lu et al. | AlexNet+TL | 100.00% | 100.00% | – | – | 100.00% |
| Ours | TReC | 100.00% | 100.00% | 100.00% | 100.00% | 100.00% |
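All five metrics reported in these comparison tables can be derived from the entries of a binary confusion matrix. A short sketch with illustrative counts (the numbers below are made up for the example, not taken from the paper):

```python
# Compute the standard binary-classification metrics from confusion-matrix
# counts. The counts below are illustrative only.
tp, fn = 36, 2   # pathological images classified correctly / missed
tn, fp = 28, 1   # normal images classified correctly / false alarms

sensitivity = tp / (tp + fn)                 # recall on the pathological class
specificity = tn / (tn + fp)                 # recall on the normal class
precision = tp / (tp + fp)
f1 = 2 * precision * sensitivity / (precision + sensitivity)
accuracy = (tp + tn) / (tp + tn + fp + fn)

for name, v in [("Sensitivity", sensitivity), ("Specificity", specificity),
                ("Precision", precision), ("F1-score", f1),
                ("Accuracy", accuracy)]:
    print(f"{name}: {100 * v:.2f}%")
```

Note that a dash in the tables simply means the original study did not report that metric; it cannot be recovered without the underlying confusion matrix.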
Performance for each fold of the proposed model with different network depths.

| Model | Fold | Sensitivity (%) | Specificity (%) | Precision (%) | F1-score (%) | Accuracy (%) |
|---|---|---|---|---|---|---|
| Transferred ResNet152-CBAM | 1 | 98.46 | 99.46 | 93.33 | 95.20 | 97.44 |
| | 2 | 87.56 | 97.03 | 94.03 | 89.53 | 89.74 |
| | 3 | 79.76 | 95.20 | 94.00 | 84.74 | 84.62 |
| | 4 | 88.93 | 96.62 | 91.19 | 89.18 | 87.18 |
| | 5 | 100.00 | 100.00 | 100.00 | 100.00 | 100.00 |
| | Average | 90.94 | 97.66 | 94.51 | 91.73 | 91.80 |
| Transferred ResNet101-CBAM | 1 | 91.67 | 97.80 | 95.33 | 92.84 | 92.31 |
| | 2 | 92.14 | 97.88 | 93.11 | 92.32 | 92.31 |
| | 3 | 97.50 | 99.33 | 98.00 | 97.61 | 97.44 |
| | 4 | 100.00 | 100.00 | 100.00 | 100.00 | 100.00 |
| | 5 | 86.67 | 97.95 | 89.66 | 87.39 | 92.31 |
| | Average | 93.60 | 98.59 | 95.22 | 94.03 | 94.87 |
| Transferred ResNet18-CBAM | 1 | 95.14 | 98.47 | 96.03 | 95.52 | 94.87 |
| | 2 | 95.14 | 98.64 | 96.00 | 95.41 | 94.87 |
| | 3 | 96.67 | 98.86 | 93.33 | 94.18 | 94.87 |
| | 4 | 95.00 | 99.43 | 96.00 | 94.92 | 97.44 |
| | 5 | 85.33 | 97.90 | 90.67 | 86.91 | 92.31 |
| | Average | 93.46 | 98.66 | 94.41 | 93.39 | 94.87 |
| Transferred ResNet50-CBAM | 1 | 98.18 | 99.20 | 98.67 | 98.36 | 97.44 |
| | 2 | 96.67 | 99.05 | 98.95 | 97.64 | 97.44 |
| | 3 | 97.14 | 99.20 | 98.67 | 97.77 | 97.44 |
| | 4 | 93.81 | 98.62 | 96.36 | 94.74 | 94.87 |
| | 5 | 97.14 | 99.17 | 98.75 | 97.82 | 97.44 |
| | Average | 96.59 | 99.05 | **98.28** | 97.27 | 96.92 |
| Transferred ResNet34-CBAM | 1 | 95.00 | 99.20 | 98.67 | 96.45 | 97.44 |
| | 2 | 100.00 | 100.00 | 100.00 | 100.00 | 100.00 |
| | 3 | 97.14 | 99.36 | 97.78 | 97.29 | 97.44 |
| | 4 | 95.56 | 98.52 | 95.97 | 95.36 | 94.87 |
| | 5 | 96.67 | 99.13 | 98.82 | 97.58 | 97.44 |
| | Average | **96.87** | **99.24** | 98.25 | **97.34** | **97.44** |
The bold values indicate the best performance on the particular metrics.
Figure 7 Averaged results for different network depths.
Figure 8 Confusion matrices of transferred ResNet18-CBAM. (A) fold-1; (B) fold-2; (C) fold-3; (D) fold-4; (E) fold-5.
Figure 12 Confusion matrices of transferred ResNet152-CBAM. (A) fold-1; (B) fold-2; (C) fold-3; (D) fold-4; (E) fold-5.
Figure 13 Comparison of evaluation metrics for transferred ResNet34 models with and without CBAM.
Comparison with state-of-the-art methods for the multi-class task.

| Study | Method | Accuracy |
|---|---|---|
| Nayak et al. | FCEntF-II + K-ELM | 93.00% |
| Talo et al. | Transfer learning with ResNet50 | 95.23% |
| TReC (Ours) | Transferred ResNet34-CBAM | 97.44% |