Karim Aderghal, Karim Afdel, Jenny Benois-Pineau, Gwénaëlle Catheline.
Abstract
BACKGROUND: Alzheimer's Disease (AD) is a neurodegenerative disease characterized by progressive loss of memory and a general decline in cognitive functions. Multi-modal imaging, such as structural MRI and DTI, provides useful information for the classification of patients on the basis of brain biomarkers. Recently, convolutional neural network (CNN) methods have emerged as powerful tools for improving image-based classification.
Keywords: Alzheimer's Disease; Applied computing; Applied computing in medical science; Artificial intelligence; Computing methodology; Convolutional Neural Network (CNN); Diffusion Tensor Imaging (DTI); Image classification; Image processing; Magnetic Resonance Imaging (MRI); Medical imaging; Multi-modality; Signal processing; Transfer learning
Year: 2020 PMID: 33336093 PMCID: PMC7733012 DOI: 10.1016/j.heliyon.2020.e05652
Source DB: PubMed Journal: Heliyon ISSN: 2405-8440
Demographic description of the ADNI dataset groups. Values are reported as mean ± standard deviation (* subjects with both modalities).
| Dataset | Class | # Subjects | Age [range] / mean ± SD | Gender (#F/#M) | MMSE (mean ± SD) |
|---|---|---|---|---|---|
| ADNI-1 | AD | 188 | [55.18, 90.99] / 75.37 ± 7.52 | 99/89 | 23.3 ± 2.03 |
| | MCI | 399 | [54.63, 89.38] / 74.89 ± 7.30 | 256/143 | 27.0 ± 1.78 |
| | NC | 228 | [60.02, 89.74] / 75.98 ± 5.02 | 118/110 | 29.1 ± 1.00 |
| ADNI-2/Go | AD | 48* | [55.73, 90.87] / 75.60 ± 8.63 | 28/20 | 23.0 ± 2.42 |
| | MCI | 108* | [55.33, 93.62] / 74.40 ± 7.47 | 66/42 | 27.4 ± 1.99 |
| | NC | 58* | [59.91, 93.25] / 74.91 ± 5.90 | 28/30 | 28.9 ± 1.18 |
| ADNI-3 | AD | 16* | [55.26, 86.10] / 74.63 ± 9.92 | 4/12 | - |
| | MCI | 165* | [55.88, 95.93] / 75.01 ± 7.91 | 71/94 | - |
| | NC | 341* | [55.79, 95.39] / 73.52 ± 7.82 | 209/132 | - |
Figure 1. Schematic diagram of dataset preprocessing: 1) registration of all MRI scans to MNI space, followed by intensity normalization; 2) ROI selection using the AAL atlas for both hippocampal regions; 3) 2D-slice extraction from the selected 3D volume; 4) feeding the CNN networks.
Figure 2. Illustration of the co-registration process, including spatial normalization and skull stripping.
Figure 3. Illustration of the 2-D+ε approach for each projection.
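The 2-D+ε idea of Figure 3 (a central ROI slice plus its ε neighbours on each side, stacked as channels and extracted along each of the three projections) can be sketched as follows. The ROI size, centre and ε value here are illustrative assumptions, not the paper's exact settings.

```python
import numpy as np

def extract_2d_eps(volume, center, eps=1, axis=0):
    """Extract a (2*eps+1)-slice stack around `center` along `axis`.

    This is the 2-D+epsilon idea: a central 2-D slice plus its eps
    neighbours on each side, fed to the CNN as input channels.
    """
    sl = [slice(None)] * 3
    sl[axis] = slice(center - eps, center + eps + 1)
    stack = volume[tuple(sl)]
    # Move the slice axis first so the result is (2*eps+1, H, W).
    return np.moveaxis(stack, axis, 0)

# Toy hippocampal ROI volume (assumed size, for illustration only).
roi = np.random.rand(28, 28, 28)
sagittal = extract_2d_eps(roi, center=14, eps=1, axis=0)  # shape (3, 28, 28)
coronal  = extract_2d_eps(roi, center=14, eps=1, axis=1)
axial    = extract_2d_eps(roi, center=14, eps=1, axis=2)
```

With ε = 1 each projection yields a 3-channel 2-D input, which is what makes a standard 2-D CNN applicable to the 3-D ROI.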
Figure 4. Example of the hippocampal region in different projections for two subjects: (A) MD and (B) sMRI.
Figure 5. The transfer learning scheme for parameter optimization from sMRI to the MD-DTI modality, with an example of the proposed architecture for 2-way classification.
Algorithm 1. Pseudo-code of the data augmentation (DA) procedure.
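As a rough illustration of what such a DA step can look like (the paper's exact Algorithm 1 is not reproduced here), the following minimal sketch assumes random in-plane shifts and horizontal flips of the extracted 2-D slices:

```python
import numpy as np

def augment(slices, max_shift=2, seed=0):
    """Toy data-augmentation (DA) sketch: random in-plane shifts and
    left-right flips of 2-D slices. The shift range and flip choice
    are illustrative assumptions, not the paper's exact algorithm."""
    rng = np.random.default_rng(seed)
    out = []
    for s in slices:
        dx, dy = rng.integers(-max_shift, max_shift + 1, size=2)
        shifted = np.roll(np.roll(s, dx, axis=0), dy, axis=1)
        if rng.random() < 0.5:
            shifted = shifted[:, ::-1]  # left-right flip
        out.append(shifted)
    return out

originals = [np.random.rand(28, 28) for _ in range(5)]
# Expand each original slice into 10 augmented variants.
augmented = [a for i in range(10) for a in augment(originals, seed=i)]
```

Repeating such transformations is how a few hundred subjects can yield the tens of thousands of training images reported in the augmentation table below.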
Number of subjects per class before augmentation and number of images after augmentation (* subjects with both modalities).
| Dataset | Split | AD | MCI | NC | AD (aug.) | MCI (aug.) | NC (aug.) |
|---|---|---|---|---|---|---|---|
| Dataset 1 | Train | 146 | | 446 | 48200 | 48200 | 48200 |
| | Valid | 42 | | 117 | 12600 | 12600 | 12600 |
| | Test | 64 | 64 | 64 | 640 | 640 | 640 |
| | Total | 252 | 672 | 627 | 61440 | 61440 | 61440 |
| Dataset 2* | Train | 31 | 198 | | 29900 | 29900 | 29900 |
| | Valid | 13 | 55 | | 8000 | 8000 | 8000 |
| | Test | 20 | 20 | 20 | 200 | 200 | 200 |
| | Total | 64 | 273 | 399 | 38100 | 38100 | 38100 |
Classification results for each single projection and for their fusion by majority vote on the sMRI dataset.
| Task | Projection | Acc (%) | Sen (%) | Spe (%) | BAcc (%) |
|---|---|---|---|---|---|
| AD vs. NC | Sagittal | 82.92 | 85.72 | 79.84 | 82.78 |
| | Coronal | 81.04 | 83.20 | 78.63 | 80.41 |
| | Axial | 79.81 | 81.31 | 77.65 | 79.48 |
| | Fusion* | | | | |
| AD vs. MCI | Sagittal | 66.73 | 68.52 | 63.91 | 66.21 |
| | Coronal | 67.61 | 71.25 | 61.88 | 66.56 |
| | Axial | 65.55 | 66.60 | 61.57 | 64.08 |
| | Fusion* | | | | |
| MCI vs. NC | Sagittal | 65.51 | 61.64 | 69.48 | 65.56 |
| | Coronal | 66.45 | 60.27 | 65.11 | 62.69 |
| | Axial | 63.89 | 59.15 | 64.57 | 61.86 |
| | Fusion* | | | | |
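The per-projection predictions are fused by majority vote, and performance is reported as accuracy (Acc), sensitivity (Sen), specificity (Spe) and balanced accuracy (BAcc = (Sen + Spe)/2). A minimal sketch of both, assuming binary labels with 1 for the positive class (e.g. AD):

```python
import numpy as np

def majority_vote(*predictions):
    """Fuse binary predictions from several projection networks
    (sagittal, coronal, axial) by a per-sample majority vote."""
    votes = np.stack(predictions)
    return (votes.sum(axis=0) > votes.shape[0] / 2).astype(int)

def metrics(y_true, y_pred):
    """Accuracy, sensitivity, specificity and balanced accuracy."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    tp = np.sum((y_true == 1) & (y_pred == 1))
    tn = np.sum((y_true == 0) & (y_pred == 0))
    fp = np.sum((y_true == 0) & (y_pred == 1))
    fn = np.sum((y_true == 1) & (y_pred == 0))
    sen = tp / (tp + fn)
    spe = tn / (tn + fp)
    acc = (tp + tn) / y_true.size
    return acc, sen, spe, (sen + spe) / 2

# Toy per-projection predictions for six samples (illustrative values).
sag = np.array([1, 0, 1, 1, 0, 0])
cor = np.array([1, 1, 0, 1, 0, 1])
axi = np.array([0, 0, 1, 1, 0, 0])
fused = majority_vote(sag, cor, axi)  # -> [1, 0, 1, 1, 0, 0]
acc, sen, spe, bacc = metrics([1, 0, 1, 1, 0, 1], fused)
```

Balanced accuracy is the metric to compare across tables here, since the AD/MCI/NC groups are of very different sizes.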
Figure 6. Example of transfer learning for a single network, AD/NC comparison: a) transfer from sMRI to MD-DTI; b) training from scratch on the MD-DTI dataset.
Figure 7. Example of transfer learning, AD/MCI comparison: a) transfer from sMRI to MD-DTI; b) training from scratch on the MD-DTI dataset.
Figure 8. Comparison of loss curves during training for AD/NC classification: a) transfer learning from sMRI to MD-DTI, with reduced over-fitting; b) training from scratch, with slight over-fitting.
Binary classification results with Transfer Learning from sMRI to MD-DTI data and fusion (* both modalities).
| Task | Modality | Projection | Acc (%) | Sen (%) | Spe (%) | BAcc (%) |
|---|---|---|---|---|---|---|
| AD vs. NC | MD | Sagittal | 84.93 | 86.07 | 81.23 | 83.65 |
| | | Coronal | 80.62 | 81.15 | 79.75 | 80.45 |
| | | Axial | 79.50 | 81.91 | 78.04 | 79.97 |
| | | Fusion* | | | | |
| AD vs. MCI | MD | Sagittal | 65.12 | 72.25 | 68.44 | 70.34 |
| | | Coronal | 72.87 | 76.58 | 71.93 | 74.25 |
| | | Axial | 64.79 | 69.14 | 66.28 | 67.71 |
| | | Fusion* | | | | |
| MCI vs. NC | MD | Sagittal | 65.59 | 66.48 | 69.32 | 67.90 |
| | | Coronal | 69.14 | 67.97 | 70.82 | 69.39 |
| | | Axial | 64.98 | 67.71 | 71.06 | 69.38 |
| | | Fusion* | | | | |
Classification results with the one-level transfer learning scheme: from MNIST to sMRI and from MNIST to MD-DTI data.
| Task | Modality | Projection | Acc (%) | Sen (%) | Spe (%) | BAcc (%) |
|---|---|---|---|---|---|---|
| AD vs. NC | sMRI | Sagittal | 80.02 | 81.95 | 79.26 | 80.60 |
| | | Coronal | 79.94 | 80.59 | 78.18 | 79.38 |
| | | Axial | 79.11 | 81.05 | 79.42 | 80.23 |
| | MD | Sagittal | 81.85 | 83.24 | 79.49 | 81.36 |
| | | Coronal | 79.22 | 83.01 | 78.56 | 80.78 |
| | | Axial | 78.69 | 82.44 | 79.71 | 81.07 |
| | | Fusion* | | | | |
| AD vs. MCI | sMRI | Sagittal | 65.32 | 66.81 | 64.52 | 65.66 |
| | | Coronal | 64.57 | 65.63 | 63.74 | 64.68 |
| | | Axial | 61.74 | 63.05 | 59.88 | 61.46 |
| | MD | Sagittal | 64.95 | 70.19 | 66.45 | 68.32 |
| | | Coronal | 68.60 | 72.51 | 67.96 | 70.23 |
| | | Axial | 62.36 | 68.24 | 62.47 | 65.35 |
| | | Fusion* | | | | |
| MCI vs. NC | sMRI | Sagittal | 64.75 | 62.35 | 66.72 | 64.53 |
| | | Coronal | 60.49 | 58.62 | 63.40 | 61.01 |
| | | Axial | 60.15 | 59.14 | 62.63 | 60.88 |
| | MD | Sagittal | 63.59 | 63.18 | 66.93 | 65.05 |
| | | Coronal | 67.14 | 64.24 | 69.86 | 67.05 |
| | | Axial | 64.98 | 63.91 | 68.55 | 66.23 |
| | | Fusion* | | | | |
Classification results with the two-level transfer learning scheme: from MNIST to MD-DTI via sMRI data.
| Task | Modality | Projection | Acc (%) | Sen (%) | Spe (%) | BAcc (%) |
|---|---|---|---|---|---|---|
| AD vs. NC | MD | Sagittal | 85.14 | 87.95 | 84.14 | 86.04 |
| | | Coronal | 82.57 | 84.55 | 80.84 | 82.69 |
| | | Axial | 81.21 | 84.26 | 81.10 | 82.68 |
| | | Fusion* | | | | |
| AD vs. MCI | MD | Sagittal | 70.84 | 77.25 | 73.51 | 75.38 |
| | | Coronal | 76.53 | 78.39 | 76.64 | 77.51 |
| | | Axial | 69.21 | 73.08 | 68.52 | 70.80 |
| | | Fusion* | | | | |
| MCI vs. NC | MD | Sagittal | 71.09 | 70.15 | 74.95 | 72.55 |
| | | Coronal | 75.34 | 72.41 | 76.39 | 74.40 |
| | | Axial | 70.21 | 69.10 | 73.64 | 71.37 |
| | | Fusion* | | | | |
Comparison of classification performances reported in the literature.
| Study | AD | MCI | NC | Classifier | Modality | Approach | AD vs. NC | AD vs. MCI | MCI vs. NC |
|---|---|---|---|---|---|---|---|---|---|
| Sarraf et al. | 52 | - | 92 | CNN - LeNet-5 | sMRI | 2D slice-level | 97.88% | - | - |
| | 211 | - | 91 | CNN - GoogLeNet | sMRI | 2D slice-level | 98.74% | - | - |
| Khvostikov et al. | 53 | 228 | 250 | CNN | sMRI+DTI | 3D ROI-based | 93.3% | 86.7% | 73.3% |
| Gupta et al. | 200 | 411 | 232 | CNN | sMRI | 2D slice-level | 93.80% | 86.30% | 83.30% |
| Billones et al. | 53 | 228 | 250 | CNN - VGG-Net | sMRI | 2D slice-level | 98.33% | 93.89% | 91.67% |
| Lee et al. | 192 | 398 | 229 | CNN - AlexNet | sMRI | 2D slice-level | 98.74% | - | - |
| | 100 | - | 316 | CNN - AlexNet | sMRI | 2D slice-level | 95.35% | - | - |
| Valliani et al. | 188 | 243 | 229 | CNN - ResNet | sMRI | 2D slice-level | 81.3% | - | - |
| Cheng et al. | 199 | - | 229 | CNN | sMRI | 3D subject-level | 83.88% | - | - |
| Glozman et al. | 200 | 132 | 221 | CNN - AlexNet | sMRI | 2D slice-level | 66.51% | - | - |
| Hon et al. | 100 | - | 100 | CNN - VGG-Net | sMRI | 2D slice-level | 92.30% | - | - |
| | | - | | CNN - Inception V4 | sMRI | 2D slice-level | 96.25% | - | - |
| Payan et al. | 755 | 755 | 755 | CNN | sMRI | 3D subject-level | 95.39% | 86.84% | 92.13% |
| Lian et al. | 358 | - | 429 | H-FCN | sMRI | 3D patch-level | 90.00% | - | - |
| Proposed cross-modal transfer (1) | 252 | 672 | 627 | CNN | sMRI+DTI | 2D ROI-based | 92.11% | 74.41% | 73.91% |
| Cross-domain one-level transfer (2) | 64 | 273 | 399 | CNN - LeNet | sMRI+DTI | 2D ROI-based | 86.83% | 71.45% | 69.85% |
| Proposed two-level transfer (3) | 64 | 273 | 399 | CNN - LeNet | sMRI+DTI | 2D ROI-based | 92.30% | 79.16% | 78.48% |