Shuihua Wang, Chaosheng Tang, Junding Sun, Yudong Zhang.
Abstract
Cerebral micro-bleeds (CMBs) are small chronic brain hemorrhages with many adverse consequences: they can lead to long-term disability, neurologic dysfunction, and cognitive impairment, and they can interfere with other medications and treatments. It is therefore essential to detect CMBs early so that treatment can begin promptly. Because labeled samples are limited, it is hard to train a classifier to high accuracy; we therefore proposed employing a densely connected neural network (DenseNet) as the base model for transfer learning to detect CMBs. To generate subsamples for training and testing, we slid a window over the whole original images from left to right and from top to bottom, and assigned each subsample's target value according to its central pixel. To address the data imbalance, a cost matrix was also employed. The resulting model achieved a classification accuracy of 97.71%, outperforming state-of-the-art methods.
Keywords: CMB detection; DenseNet; cost matrix; deep learning; transfer learning
Year: 2019 PMID: 31156359 PMCID: PMC6533830 DOI: 10.3389/fnins.2019.00422
Source DB: PubMed Journal: Front Neurosci ISSN: 1662-453X Impact factor: 4.677
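The sliding-window sampling described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: the window size and stride here are hypothetical choices, and the paper does not restate its exact values in this record. Each patch is labeled by its central pixel, as the abstract describes.

```python
import numpy as np

def extract_patches(image, labels, window=5, stride=1):
    """Slide a window over the image (left to right, top to bottom)
    and label each patch by its central pixel.

    `window` and `stride` are illustrative; the central pixel's label
    map entry (1 = CMB voxel, 0 = non-CMB) becomes the target value.
    """
    half = window // 2
    patches, targets = [], []
    h, w = image.shape
    for r in range(half, h - half, stride):        # top to bottom
        for c in range(half, w - half, stride):    # left to right
            patch = image[r - half:r + half + 1, c - half:c + half + 1]
            patches.append(patch)
            targets.append(int(labels[r, c]))
    return np.array(patches), np.array(targets)
```

On a 10×10 image with a 5×5 window and stride 1, this yields 36 patches, one per valid center position.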
FIGURE 1 A toy example of the convolution operation in a CNN with stride 1: the left matrix is the input, the middle matrix is the kernel, and the right matrix is the feature map generated by the convolution operation. Note that this differs from convolution as defined in purely mathematical terms.
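The operation in Figure 1 can be reproduced in a few lines. CNN "convolution" is cross-correlation (no kernel flip), which is exactly the difference from the mathematical definition that the caption notes; the input values below are an arbitrary toy example, not the figure's actual numbers.

```python
import numpy as np

def conv2d(x, k, stride=1):
    """CNN-style 'convolution' (cross-correlation, no kernel flip)
    with valid padding, as in the toy example of Figure 1."""
    kh, kw = k.shape
    oh = (x.shape[0] - kh) // stride + 1
    ow = (x.shape[1] - kw) // stride + 1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            # elementwise product of the window and the kernel, summed
            out[i, j] = np.sum(x[i*stride:i*stride+kh, j*stride:j*stride+kw] * k)
    return out
```

With a 4×4 input and a 2×2 kernel at stride 1, the feature map is 3×3; raising the stride to 2 shrinks it to 2×2.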
FIGURE 2 Structure of the DenseBlock (5 layers; each layer takes the feature maps of all previous layers as input).
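The channel bookkeeping behind Figure 2 can be made concrete: because each layer receives the concatenation of the block input and all previous layers' outputs, the input width grows linearly with depth. The initial channel count (64) and growth rate (32) below are illustrative defaults, not values stated in this record.

```python
def dense_block_channels(n_layers, in_channels, growth_rate):
    """Track input width per layer in a DenseBlock: layer l sees the
    concatenation of the block input and the outputs of layers 1..l-1,
    each of which contributes `growth_rate` new feature maps."""
    layer_inputs = []
    ch = in_channels
    for _ in range(n_layers):
        layer_inputs.append(ch)      # width this layer receives
        ch += growth_rate            # concatenate this layer's output
    return layer_inputs, ch          # final ch = block output width
```

For a 5-layer block as in Figure 2 with 64 input channels and growth rate 32, the layers see 64, 96, 128, 160, and 192 channels, and the block emits 224.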
FIGURE 3 The structure of the DenseNet.
FIGURE 4 Non-CMB samples.
FIGURE 5 CMB samples.
Division of the dataset for training and testing.
| | Train | Test |
|---|---|---|
| CMB | 58,847 | 10,000 |
| Non-CMB | 56,572,536 | 10,000 |
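The table above shows a severe imbalance (roughly 56.6 million non-CMB samples against ~59 thousand CMB samples), which is why the abstract introduces a cost matrix. One common way to realise such a cost matrix is inverse-frequency class weighting inside the loss; this is a hedged sketch of that idea, not necessarily the paper's exact weighting scheme.

```python
import numpy as np

def class_weights(counts):
    """Inverse-frequency weights: the rare class gets a proportionally
    larger misclassification cost (one common cost-matrix heuristic)."""
    counts = np.asarray(counts, dtype=float)
    return counts.sum() / (len(counts) * counts)

def weighted_cross_entropy(probs, targets, weights):
    """Mean negative log-likelihood with per-class costs applied."""
    probs = np.asarray(probs, dtype=float)
    targets = np.asarray(targets)
    weights = np.asarray(weights, dtype=float)
    p = np.clip(probs[np.arange(len(targets)), targets], 1e-12, 1.0)
    return float(np.mean(weights[targets] * -np.log(p)))
```

With the training counts from the table, the CMB class receives a weight several hundred times larger than the non-CMB class, so errors on the rare class dominate the loss.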
FIGURE 6 Images padded for DenseNet.
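The padding step in Figure 6 can be sketched as follows. An ImageNet-pretrained DenseNet expects 224×224 RGB inputs, so a small grayscale patch has to be enlarged and given three channels; centered zero-padding with channel replication is one plausible realisation, and the 41×41 patch size in the usage note is a hypothetical example, not a value stated here.

```python
import numpy as np

def pad_for_densenet(patch, size=224):
    """Zero-pad a small grayscale patch to `size` x `size`, centered,
    then replicate it across 3 channels for an RGB-pretrained network.
    (Illustrative; the paper's exact padding may differ.)"""
    h, w = patch.shape
    top = (size - h) // 2
    left = (size - w) // 2
    canvas = np.zeros((size, size), dtype=patch.dtype)
    canvas[top:top + h, left:left + w] = patch
    return np.stack([canvas] * 3, axis=-1)
```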
FIGURE 7 Flowchart of DenseNet-201.
FIGURE 8 Different cases of transfer learning (the original fully connected layer with 1,000 neurons was replaced by a new fully connected layer with 2 neurons).
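The head replacement in Figure 8 boils down to discarding the pretrained 1,000-way ImageNet classifier and attaching a freshly initialised 2-way layer over the same features. The toy classes below stand in for a real framework's modules (a real implementation would swap the classifier of a pretrained DenseNet-201); the 1,920-dimensional feature width is DenseNet-201's final feature size.

```python
import numpy as np

class FCLayer:
    """Toy stand-in for a fully connected layer with a frozen flag."""
    def __init__(self, in_features, out_features, frozen=False):
        rng = np.random.default_rng(0)
        self.W = rng.standard_normal((out_features, in_features)) * 0.01
        self.b = np.zeros(out_features)
        self.frozen = frozen  # frozen layers keep their pretrained weights

def transfer_head(pretrained_head, n_classes=2):
    """Replace the pretrained 1000-way head with a new trainable
    n_classes head over the same input features, as in Figure 8."""
    in_features = pretrained_head.W.shape[1]
    return FCLayer(in_features, n_classes, frozen=False)
```

The cases A-D compared later plausibly differ in how many of the earlier layers keep `frozen=True` during fine-tuning; this record does not spell out the exact split, so the flag is only indicative.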
Confusion matrix of detected CMBs and non-CMBs.
| Actual \ Predicted | CMBs | Non-CMBs |
|---|---|---|
| CMBs (10,000) | 9,777 | 223 |
| Non-CMBs (10,000) | 236 | 9,764 |
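The four reported measures follow directly from this confusion matrix (TP = 9,777, FN = 223, FP = 236, TN = 9,764); plugging the counts in reproduces values very close to the averages reported below.

```python
def metrics(tp, fn, fp, tn):
    """Sensitivity, specificity, accuracy, precision (in %) from a
    binary confusion matrix."""
    sens = 100 * tp / (tp + fn)            # recall on the CMB class
    spec = 100 * tn / (tn + fp)            # recall on the non-CMB class
    acc  = 100 * (tp + tn) / (tp + fn + fp + tn)
    prec = 100 * tp / (tp + fp)            # positive predictive value
    return sens, spec, acc, prec

sens, spec, acc, prec = metrics(9777, 223, 236, 9764)
```

This gives sensitivity 97.77%, specificity 97.64%, accuracy 97.71%, and precision 97.64%.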
CMB detection performance based on transfer learning of DenseNet (unit: %).
| Run | Sensitivity | Specificity | Accuracy | Precision |
|---|---|---|---|---|
| Run 1 | 96.69 | 96.73 | 96.71 | 96.72 |
| Run 2 | 97.82 | 97.87 | 97.84 | 97.90 |
| Run 3 | 98.71 | 98.51 | 98.61 | 98.52 |
| Run 4 | 96.71 | 96.27 | 96.49 | 96.29 |
| Run 5 | 96.69 | 96.19 | 96.44 | 96.22 |
| Run 6 | 96.93 | 96.99 | 96.96 | 97.00 |
| Run 7 | 98.44 | 98.40 | 98.42 | 98.39 |
| Run 8 | 98.77 | 98.58 | 98.67 | 98.58 |
| Run 9 | 98.71 | 98.62 | 98.67 | 98.62 |
| Run 10 | 98.30 | 98.22 | 98.26 | 98.22 |
| Mean ± SD | 97.78 ± 0.88 | 97.64 ± 0.94 | 97.71 ± 0.90 | 97.65 ± 0.93 |
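The summary row can be checked directly from the ten per-run values. The sensitivity column is shown below; the reported 0.88 matches the population standard deviation (`ddof=0`), which appears to be the convention used, though the record does not say so explicitly.

```python
import numpy as np

# Sensitivity values of the ten runs, copied from the table above
sens = [96.69, 97.82, 98.71, 96.71, 96.69,
        96.93, 98.44, 98.77, 98.71, 98.30]

mean = float(np.mean(sens))
sd = float(np.std(sens))  # population SD (ddof=0)
```

Rounding gives 97.78 ± 0.88, the figures in the summary row.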
Comparison of different cases of transfer learning (unit: %).
| Case | Sensitivity | Specificity | Accuracy | Precision |
|---|---|---|---|---|
| Case A | 97.78 ± 0.88 | 97.64 ± 0.94 | 97.71 ± 0.90 | 97.65 ± 0.93 |
| Case B | 97.56 ± 0.83 | 97.65 ± 0.76 | 97.60 ± 0.79 | 97.67 ± 0.76 |
| Case C | 97.36 ± 1.05 | 97.66 ± 0.80 | 97.51 ± 0.92 | 97.66 ± 0.82 |
| Case D | 97.61 ± 0.63 | 97.54 ± 0.65 | 97.58 ± 0.64 | 97.57 ± 0.65 |
FIGURE 9 Error bars.
Comparison with state-of-the-art methods.
| Method | Sensitivity | Specificity | Accuracy |
|---|---|---|---|
| SNP + SLFN + LReLU | 93.05 | 93.06 | 93.06 |
| 4-layer SAE | 93.20 ± 1.37 | 93.25 ± 1.38 | 93.22 ± 1.37 |
| 7-layer SAE | 95.13 ± 0.84 | 93.33 ± 0.84 | 94.23 ± 0.84 |
| CNN + RAP | 96.94 | 97.18 | 97.18 |
| CNN | 97.29 | 92.23 | 96.05 |
| NBC | 74.53 ± 0.96 | 74.51 ± 1.05 | 74.52 ± 1.00 |
| GA-BPNN | 72.90 ± 1.38 | 72.89 ± 1.18 | 72.90 ± 1.28 |
| CNN-SP | 97.22 | 97.35 | 97.28 |
| Our method | 97.78 ± 0.88 | 97.64 ± 0.94 | 97.71 ± 0.90 |
FIGURE 10 Comparison with state-of-the-art methods (blue: sensitivity; red: specificity; yellow: accuracy).