Masyitah Abu¹, Nik Adilah Hanin Zahri¹, Amiza Amir¹, Muhammad Izham Ismail², Azhany Yaakub³, Said Amirul Anwar⁴, Muhammad Imran Ahmad⁴.
Abstract
Numerous studies have demonstrated that Convolutional Neural Network (CNN) models can classify visual field (VF) defects with high accuracy. In this study, we evaluated the performance of different pre-trained models (VGG-Net, MobileNet, ResNet, and DenseNet) in classifying VF defects and produced a comprehensive comparative analysis of the CNN models before and after hyperparameter tuning and fine-tuning. With a batch size of 32, 50 epochs, and ADAM as the optimizer for weight, bias, and learning rate, VGG-16 obtained the highest accuracy of 97.63%, according to the experimental findings. Subsequently, Bayesian optimization was used to automate hyperparameter tuning and the selection of fine-tuned layers of the pre-trained models, in order to determine the optimal hyperparameters and fine-tuning layers for classifying VF defects with the highest accuracy. We found that the combination of hyperparameters and the fine-tuning of the pre-trained models significantly impacts the performance of the deep learning models on this classification task. We also found that the automated selection of optimal hyperparameters and fine-tuned layers by Bayesian optimization significantly enhanced the performance of the pre-trained models. The best performance was observed for the DenseNet-121 model, with a validation accuracy of 98.46% and a test accuracy of 99.57% on the tested datasets.
Keywords: CNN; VF defect; fine-tuning; hyperparameter
Year: 2022 PMID: 35626413 PMCID: PMC9140208 DOI: 10.3390/diagnostics12051258
Source DB: PubMed Journal: Diagnostics (Basel) ISSN: 2075-4418
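The automated search described in the abstract can be pictured as a loop over sampled configurations. Below is a minimal stdlib sketch in which random sampling stands in for the Bayesian surrogate model (the function names `sample_config` and `search` are illustrative, and the value ranges are read off the trial tables later in this record; the paper's actual implementation is not shown here).

```python
import random

# Search space mirroring the hyperparameters tuned in the paper; the
# surrogate-model step of Bayesian optimization is replaced by random
# sampling for brevity.
SEARCH_SPACE = {
    "feature_map":   lambda: random.randint(32, 64),
    "filter_size":   lambda: random.randint(1, 3),
    "activation":    lambda: random.choice(["relu", "sigmoid"]),
    "pool_size":     lambda: random.randint(1, 2),
    "optimizer":     lambda: random.choice(["adam", "rmsprop", "sgd", "adadelta"]),
    "learning_rate": lambda: round(random.uniform(1e-4, 1e-1), 4),
    "batch_size":    lambda: random.randint(1, 32),
    "epochs":        lambda: random.randint(10, 200),
    "dropout_rate":  lambda: round(random.uniform(0.1, 0.9), 1),
    "tune_upper":    lambda: random.choice([True, False]),
    "tune_lower":    lambda: random.choice([True, False]),
}

def sample_config():
    """Draw one hyperparameter configuration from the search space."""
    return {name: draw() for name, draw in SEARCH_SPACE.items()}

def search(objective, n_trials=10, seed=0):
    """Return the best (score, config) pair over n_trials sampled configs."""
    random.seed(seed)
    best = (-1.0, None)
    for _ in range(n_trials):
        cfg = sample_config()
        score = objective(cfg)  # e.g. validation accuracy of a trained model
        if score > best[0]:
            best = (score, cfg)
    return best
```

In the paper's setting, `objective` would train the pre-trained model with the sampled configuration and return its validation accuracy; a real Bayesian optimizer would additionally fit a surrogate over past trials to choose the next configuration.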
Figure 1. VF Defect Regions [1].
Distribution of VF Defects from Collected Datasets.
| Type of VF Defect | No. of Record |
|---|---|
| Central scotoma | 188 |
| Right/Left hemianopia | 205 |
| Right/left/upper/lower quadrantanopia | 150 |
| Normal | 273 |
| Tunnel vision | 207 |
| Superior/inferior defect field | 177 |
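The class counts above are mildly imbalanced (150 to 273 records per class). One standard way to compensate during training is balanced class weighting; the paper does not state that it weighted classes, so the sketch below is purely illustrative of how such weights would be derived from this table.

```python
# Record counts taken from the collected-dataset distribution table.
counts = {
    "central_scotoma": 188,
    "hemianopia": 205,
    "quadrantanopia": 150,
    "normal": 273,
    "tunnel_vision": 207,
    "superior_inferior_defect": 177,
}

total = sum(counts.values())   # 1200 records in total
n_classes = len(counts)        # 6 VF classes

# "Balanced" class weights: total / (n_classes * count). Rarer classes
# (e.g. quadrantanopia) receive proportionally larger weights.
class_weights = {k: total / (n_classes * n) for k, n in counts.items()}
```

With these counts, quadrantanopia (150 records) gets weight 4/3 while the over-represented normal class (273 records) drops below 1.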
Figure 2. Normal VF Images: (a) HVF 10-2; (b) HVF 24-2.
Types of VF Defects.
| Defect Type | VF Image |
|---|---|
| Central scotoma | (image) |
| Right/left hemianopia | (image) |
| Right/left/upper/lower quadrantanopia | (image) |
| Tunnel vision | (image) |
| Superior/inferior defect field | (image) |
Figure 3. The framework of a comprehensive analysis of automated hyperparameters and automated fine-tuning for pre-trained models.
Figure 4. VGG-Net Model.
Figure 5. ResNet Model.
Figure 6. MobileNet Model.
Figure 7. MobileNetV2 Model.
Figure 8. DenseNet Model.
Figure 9. Bayesian optimization tuning dropout rate for VGG-Net.
Parameters of each pre-trained model.
| Model | Image Size (px) | Parameters (without VF) | Parameters (with VF) | Validation Accuracy (%) |
|---|---|---|---|---|
| VGG-16 | 224 | 14,714,688 | 14,865,222 | 97.63 |
|  | 256 | 14,714,688 | 14,911,302 | 96.55 |
| VGG-19 | 224 | 20,024,384 | 20,174,918 | 96.34 |
|  | 256 | 20,024,384 | 20,220,998 | 17.69 |
| MobileNet | 224 | 3,228,864 | 3,529,926 | 88.79 |
|  | 256 | 3,228,864 | 3,622,086 | 94.41 |
| MobileNetV2 | 224 | 2,257,984 | 2,634,310 | 70.91 |
|  | 256 | 2,257,984 | 2,749,510 | 39.24 |
| ResNet-50 | 224 | 23,587,712 | 24,189,830 | 90.46 |
|  | 256 | 23,587,712 | 24,374,150 | 86.66 |
| ResNet-101 | 224 | 42,658,176 | 43,260,294 | 95.69 |
|  | 256 | 42,658,176 | 43,444,614 | 92.91 |
| DenseNet-121 | 224 | 7,037,504 | 7,338,566 | 74.72 |
|  | 256 | 7,037,504 | 7,430,726 | 94.20 |
| DenseNet-169 | 224 | 12,642,880 | 13,132,102 | 97.20 |
|  | 256 | 12,642,880 | 13,281,862 | 93.27 |
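The "with VF" parameter counts in the table are consistent with attaching a Flatten → Dense(6) classification head (one unit per VF class) to each convolutional base; the VGG-16 rows confirm this arithmetic exactly. The six-class head is an inference from the numbers, not something the table states, so treat the sketch below as a reading of the data.

```python
# Dense-layer parameters after flattening a (spatial x spatial x channels)
# feature map: weights (flat * classes) plus biases (classes).
def head_params(spatial, channels, classes=6):
    flat = spatial * spatial * channels
    return flat * classes + classes

base = 14_714_688  # VGG-16 convolutional base ("without VF" column)

# VGG-16's last feature map is 7x7x512 for a 224x224 input and 8x8x512
# for a 256x256 input; both "with VF" counts are reproduced exactly.
assert base + head_params(7, 512) == 14_865_222   # 224 row
assert base + head_params(8, 512) == 14_911_302   # 256 row
```

The same reading explains why the "without VF" count does not change with image size: the convolutional base is independent of input resolution, and only the flattened head grows.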
Figure 10. Validation accuracy of the pre-trained models at 224 × 224 input size.
Validation of automated hyperparameter tuning and automated fine-tuning of each pre-trained model.
| Model | Feature Map | Filter Size | Activation Function | Pool Size | Optimizer | Learning Rate | Batch Size | Epoch | Dropout Rate | Fine-Tuned Upper Layer | Fine-Tuned Lower Layer | Validation Accuracy (%) |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| VGG-16 | 64 | 3 | ReLU | 2 | ADAM | 0.001 | 32 | 200 | 0.2 | FALSE | FALSE | 97.72 |
|  | 43 | 2 | Sigmoid | 1 | RMSprop | 0.0002 | 29 | 42 | 0.6 | TRUE | TRUE | 20.69 |
|  | 52 | 2 | ReLU | 2 | ADAM | 0.0006 | 19 | 54 | 0.7 | FALSE | FALSE | 98.00 |
|  | 48 | 1 | Sigmoid | 1 | RMSprop | 0.0161 | 27 | 92 | 0.2 | TRUE | FALSE | 20.69 |
|  | 53 | 2 | Sigmoid | 2 | ADAM | 0.0081 | 15 | 103 | 0.8 | FALSE | TRUE | 17.24 |
|  | 52 | 2 | Sigmoid | 2 | Adadelta | 0.0507 | 18 | 69 | 0.3 | TRUE | TRUE | 20.69 |
|  | 39 | 3 | Sigmoid | 2 | RMSprop | 0.0513 | 11 | 13 | 0.6 | TRUE | TRUE | 18.10 |
|  | 53 | 1 | ReLU | 1 | ADAM | 0.0046 | 9 | 11 | 0.8 | FALSE | FALSE | 20.69 |
|  | 55 | 3 | ReLU | 1 | Adadelta | 0.0813 | 31 | 24 | 0.3 | FALSE | TRUE | 93.97 |
|  | 34 | 2 | Sigmoid | 2 | RMSprop | 0.0002 | 15 | 66 | 0.8 | TRUE | TRUE | 20.69 |
|  | 32 | 2 | ReLU | 1 | SGD | 0.0001 | 1 | 200 | 0.1 | TRUE | FALSE | 95.73 |
| VGG-19 | 64 | 3 | ReLU | 2 | ADAM | 0.001 | 32 | 200 | 0.2 | FALSE | FALSE | 20.69 |
|  | 60 | 2 | Sigmoid | 2 | ADAM | 0.0031 | 9 | 144 | 0.3 | TRUE | FALSE | 20.69 |
|  | 60 | 2 | Sigmoid | 1 | SGD | 0.0082 | 23 | 169 | 0.6 | TRUE | FALSE | 20.69 |
|  | 33 | 2 | ReLU | 2 | ADAM | 0.0004 | 31 | 166 | 0.4 | FALSE | FALSE | 97.84 |
|  | 35 | 2 | ReLU | 1 | SGD | 0.0338 | 23 | 57 | 0.2 | TRUE | TRUE | 93.84 |
|  | 46 | 1 | Sigmoid | 1 | ADAM | 0.0008 | 19 | 163 | 0.1 | FALSE | FALSE | 20.69 |
|  | 54 | 2 | ReLU | 2 | RMSprop | 0.0048 | 30 | 193 | 0.5 | FALSE | FALSE | 20.69 |
|  | 40 | 3 | ReLU | 2 | ADAM | 0.0248 | 6 | 101 | 0.7 | TRUE | FALSE | 20.69 |
|  | 35 | 2 | Sigmoid | 1 | ADAM | 0.0002 | 30 | 73 | 0.2 | FALSE | FALSE | 20.69 |
|  | 40 | 3 | ReLU | 1 | RMSprop | 0.0046 | 31 | 36 | 0.1 | TRUE | TRUE | 20.69 |
|  | 41 | 1 | ReLU | 1 | RMSprop | 0.0002 | 13 | 79 | 0.8 | TRUE | FALSE | 53.28 |
| MobileNet | 64 | 3 | ReLU | 2 | ADAM | 0.001 | 32 | 200 | 0.2 | FALSE | FALSE | 93.58 |
|  | 45 | 2 | Sigmoid | 2 | SGD | 0.0322 | 15 | 96 | 0.6 | TRUE | FALSE | 18.44 |
|  | 60 | 3 | Sigmoid | 1 | RMSprop | 0.0018 | 18 | 138 | 0.3 | FALSE | FALSE | 29.26 |
|  | 41 | 2 | Sigmoid | 2 | Adadelta | 0.0003 | 9 | 61 | 0.8 | TRUE | TRUE | 20.69 |
|  | 48 | 2 | Sigmoid | 2 | ADAM | 0.0002 | 9 | 135 | 0.4 | TRUE | FALSE | 15.51 |
|  | 57 | 1 | ReLU | 2 | Adadelta | 0.0055 | 7 | 106 | 0.2 | TRUE | FALSE | 94.40 |
|  | 45 | 2 | ReLU | 2 | Adadelta | 0.001 | 13 | 154 | 0.4 | TRUE | FALSE | 93.62 |
|  | 57 | 3 | Sigmoid | 1 | SGD | 0.0001 | 20 | 65 | 0.3 | TRUE | FALSE | 20.69 |
|  | 32 | 2 | Sigmoid | 1 | ADAM | 0.0531 | 2 | 45 | 0.9 | FALSE | FALSE | 20.69 |
|  | 57 | 1 | ReLU | 2 | RMSprop | 0.0005 | 32 | 21 | 0.4 | TRUE | TRUE | 95.13 |
|  | 52 | 2 | Sigmoid | 1 | RMSprop | 0.0319 | 5 | 133 | 0.2 | FALSE | FALSE | 17.16 |
| MobileNetV2 | 64 | 3 | ReLU | 2 | ADAM | 0.001 | 32 | 200 | 0.2 | FALSE | FALSE | 92.50 |
|  | 58 | 1 | ReLU | 1 | RMSprop | 0.0053 | 14 | 16 | 0.7 | FALSE | FALSE | 20.82 |
|  | 49 | 3 | ReLU | 1 | Adadelta | 0.0009 | 17 | 189 | 0.8 | TRUE | FALSE | 36.42 |
|  | 44 | 3 | ReLU | 1 | RMSprop | 0.0003 | 26 | 43 | 0.4 | TRUE | TRUE | 86.47 |
|  | 44 | 3 | ReLU | 1 | SGD | 0.01 | 14 | 65 | 0.6 | TRUE | TRUE | 81.77 |
|  | 36 | 2 | ReLU | 2 | RMSprop | 0.0001 | 11 | 67 | 0.4 | FALSE | FALSE | 95.60 |
|  | 64 | 2 | ReLU | 2 | RMSprop | 0.0134 | 7 | 119 | 0.7 | FALSE | TRUE | 20.69 |
|  | 51 | 2 | Sigmoid | 2 | RMSprop | 0.0015 | 24 | 181 | 0.1 | TRUE | FALSE | 74.05 |
|  | 51 | 1 | Sigmoid | 1 | RMSprop | 0.0006 | 6 | 168 | 0.5 | TRUE | TRUE | 76.38 |
|  | 52 | 2 | ReLU | 2 | SGD | 0.0022 | 14 | 66 | 0.3 | FALSE | FALSE | 52.93 |
|  | 46 | 3 | ReLU | 1 | RMSprop | 0.0003 | 21 | 33 | 0.1 | FALSE | TRUE | 84.01 |
| ResNet-50 | 64 | 3 | ReLU | 2 | ADAM | 0.001 | 32 | 200 | 0.2 | FALSE | FALSE | 97.33 |
|  | 63 | 3 | ReLU | 2 | RMSprop | 0.0601 | 25 | 101 | 0.4 | TRUE | TRUE | 47.85 |
|  | 50 | 3 | ReLU | 1 | RMSprop | 0.0005 | 30 | 174 | 0.8 | FALSE | TRUE | 97.28 |
|  | 50 | 1 | Sigmoid | 1 | ADAM | 0.0051 | 16 | 18 | 0.8 | FALSE | FALSE | 21.25 |
|  | 54 | 2 | Sigmoid | 1 | ADAM | 0.0048 | 7 | 157 | 0.2 | FALSE | TRUE | 74.18 |
|  | 41 | 1 | Sigmoid | 2 | ADAM | 0.0364 | 24 | 129 | 0.3 | FALSE | TRUE | 22.63 |
|  | 45 | 1 | ReLU | 1 | RMSprop | 0.0189 | 13 | 12 | 0.3 | FALSE | TRUE | 85.60 |
|  | 41 | 2 | Sigmoid | 2 | Adadelta | 0.0142 | 11 | 66 | 0.9 | TRUE | TRUE | 97.46 |
|  | 50 | 2 | ReLU | 2 | ADAM | 0.0017 | 23 | 178 | 0.3 | TRUE | TRUE | 95.60 |
|  | 46 | 2 | ReLU | 2 | Adadelta | 0.0049 | 25 | 13 | 0.8 | TRUE | FALSE | 75.00 |
|  | 51 | 2 | ReLU | 2 | RMSprop | 0.0076 | 32 | 200 | 0.1 | TRUE | TRUE | 96.25 |
| ResNet-101 | 64 | 3 | ReLU | 2 | ADAM | 0.001 | 32 | 200 | 0.2 | FALSE | FALSE | 96.07 |
|  | 42 | 2 | ReLU | 2 | RMSprop | 0.061 | 15 | 110 | 0.6 | FALSE | TRUE | 75.47 |
|  | 52 | 2 | Sigmoid | 1 | RMSprop | 0.003 | 32 | 106 | 0.2 | FALSE | TRUE | 19.01 |
|  | 61 | 3 | ReLU | 2 | Adadelta | 0.0023 | 20 | 54 | 0.7 | TRUE | FALSE | 93.36 |
|  | 45 | 2 | Sigmoid | 2 | ADAM | 0.0001 | 10 | 87 | 0.3 | FALSE | FALSE | 45.13 |
|  | 36 | 2 | Sigmoid | 1 | SGD | 0.0029 | 19 | 133 | 0.1 | FALSE | TRUE | 96.67 |
|  | 38 | 2 | ReLU | 1 | ADAM | 0.0011 | 12 | 130 | 0.7 | TRUE | FALSE | 96.29 |
|  | 44 | 3 | ReLU | 2 | SGD | 0.0002 | 16 | 182 | 0.6 | FALSE | TRUE | 96.80 |
|  | 58 | 2 | Sigmoid | 2 | SGD | 0.0001 | 18 | 32 | 0.4 | FALSE | FALSE | 96.77 |
|  | 32 | 2 | Sigmoid | 2 | Adadelta | 0.1 | 18 | 66 | 0.9 | TRUE | FALSE | 93.92 |
|  | 36 | 1 | ReLU | 2 | ADAM | 0.0016 | 7 | 113 | 0.1 | TRUE | FALSE | 96.94 |
| DenseNet-121 | 64 | 3 | ReLU | 2 | ADAM | 0.001 | 32 | 200 | 0.2 | FALSE | FALSE | 97.96 |
|  | 63 | 1 | Sigmoid | 1 | ADAM | 0.0329 | 24 | 61 | 0.1 | TRUE | TRUE | 71.38 |
|  | 38 | 2 | Sigmoid | 2 | RMSprop | 0.0687 | 22 | 38 | 0.8 | FALSE | FALSE | 83.58 |
|  | 44 | 1 | ReLU | 2 | SGD | 0.0324 | 9 | 167 | 0.7 | FALSE | TRUE | 97.89 |
|  | 41 | 2 | Sigmoid | 2 | ADAM | 0.0003 | 15 | 67 | 0.6 | FALSE | FALSE | 76.59 |
|  | 51 | 3 | ReLU | 2 | Adadelta | 0.0091 | 17 | 195 | 0.8 | FALSE | FALSE | 98.45 |
|  | 49 | 1 | ReLU | 1 | Adadelta | 0.0333 | 11 | 85 | 0.7 | FALSE | TRUE | 98.06 |
|  | 60 | 2 | ReLU | 1 | ADAM | 0.0217 | 18 | 111 | 0.1 | TRUE | FALSE | 89.44 |
|  | 34 | 1 | Sigmoid | 2 | SGD | 0.0024 | 10 | 136 | 0.4 | TRUE | FALSE | 82.46 |
|  | 47 | 2 | Sigmoid | 1 | RMSprop | 0.01 | 14 | 67 | 0.7 | TRUE | FALSE | 95.39 |
|  | 55 | 2 | Sigmoid | 2 | ADAM | 0.0044 | 18 | 142 | 0.7 | TRUE | FALSE | 76.72 |
| DenseNet-169 | 64 | 3 | ReLU | 2 | ADAM | 0.001 | 32 | 200 | 0.2 | FALSE | FALSE | 94.27 |
|  | 52 | 2 | ReLU | 1 | ADAM | 0.0183 | 25 | 52 | 0.5 | FALSE | FALSE | 87.93 |
|  | 40 | 2 | Sigmoid | 1 | RMSprop | 0.0108 | 14 | 69 | 0.8 | TRUE | TRUE | 96.29 |
|  | 38 | 2 | ReLU | 1 | Adadelta | 0.0022 | 24 | 143 | 0.3 | FALSE | FALSE | 98.43 |
|  | 52 | 2 | ReLU | 2 | ADAM | 0.0005 | 11 | 188 | 0.3 | TRUE | TRUE | 97.76 |
|  | 43 | 3 | ReLU | 1 | ADAM | 0.0013 | 24 | 76 | 0.2 | TRUE | TRUE | 96.85 |
|  | 42 | 1 | ReLU | 2 | ADAM | 0.0004 | 19 | 156 | 0.6 | FALSE | TRUE | 97.93 |
|  | 54 | 2 | ReLU | 1 | RMSprop | 0.0049 | 4 | 176 | 0.8 | FALSE | FALSE | 91.98 |
|  | 45 | 3 | Sigmoid | 2 | RMSprop | 0.0003 | 25 | 20 | 0.5 | FALSE | FALSE | 52.84 |
|  | 49 | 1 | Sigmoid | 1 | SGD | 0.0061 | 2 | 178 | 0.4 | FALSE | FALSE | 96.90 |
|  | 35 | 3 | ReLU | 1 | ADAM | 0.0621 | 19 | 109 | 0.4 | TRUE | FALSE | 16.72 |
Figure 11. Validation of pre-trained models when some layers were frozen: (a) freeze all layers (the upper and lower layers of the model are FALSE); (b) freeze the upper layers of the model (the upper layers are FALSE and the lower layers are TRUE); (c) freeze the lower layers of the model (the upper layers are TRUE and the lower layers are FALSE); (d) unfreeze all layers of the model (the upper and lower layers are TRUE).
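The four configurations in Figure 11 amount to toggling trainability of the upper and lower portions of a model. A toy sketch of that toggle (the `Layer` class and `apply_freeze` helper are illustrative stand-ins for framework layers, and splitting the model at its midpoint is an assumption; the paper's actual split point is not given here):

```python
# Minimal stand-in for a framework layer with a trainable flag.
class Layer:
    def __init__(self, name):
        self.name = name
        self.trainable = True

def apply_freeze(layers, tune_upper, tune_lower):
    """Set trainability per half: FALSE for a half means it is frozen."""
    mid = len(layers) // 2
    for layer in layers[:mid]:
        layer.trainable = tune_upper   # upper (earlier) layers
    for layer in layers[mid:]:
        layer.trainable = tune_lower   # lower (later) layers
    return layers

model = [Layer(f"conv{i}") for i in range(6)]
apply_freeze(model, tune_upper=False, tune_lower=True)  # Figure 11(b)
```

In a real transfer-learning setup the same idea is expressed by setting each layer's `trainable` attribute before compiling the model.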
Comparison of the testing results of automated hyperparameter tuning and automated fine-tuning for the pre-trained models.
| Method | Precision (%) | Recall (%) | F1 (%) | Accuracy (%) | Loss |
|---|---|---|---|---|---|
| VGG-16 | 97.66 | 97.66 | 97.50 | 98.28 | 0.0760 |
| VGG-19 | 96.66 | 96.83 | 96.66 | 97.84 | 0.1701 |
| MobileNet | 92.00 | 93.83 | 91.50 | 92.45 | 0.3170 |
| MobileNetV2 | 97.66 | 97.93 | 97.66 | 97.84 | 0.3087 |
| ResNet-50 | 97.33 | 97.83 | 97.33 | 97.41 | 0.0792 |
| ResNet-101 | 96.66 | 96.33 | 96.33 | 96.55 | 0.1346 |
| DenseNet-121 | 99.83 | 99.83 | 99.66 | 99.57 | 0.0048 |
| DenseNet-169 | 98.83 | 98.83 | 98.66 | 98.92 | 0.0774 |
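The per-model precision, recall, and F1 scores above can be derived from each model's confusion matrix (Figure 12). A minimal sketch of that computation follows; macro-averaging over the classes is an assumption, as the record does not state which averaging convention was used.

```python
def macro_metrics(cm):
    """Macro-averaged precision, recall, F1, plus overall accuracy.

    cm[i][j] = number of samples of true class i predicted as class j.
    """
    k = len(cm)
    precisions, recalls = [], []
    for c in range(k):
        tp = cm[c][c]
        fp = sum(cm[r][c] for r in range(k)) - tp  # column sum minus diagonal
        fn = sum(cm[c]) - tp                       # row sum minus diagonal
        precisions.append(tp / (tp + fp) if tp + fp else 0.0)
        recalls.append(tp / (tp + fn) if tp + fn else 0.0)
    # Macro F1: average the per-class F1 scores.
    f1s = [2 * p * r / (p + r) if p + r else 0.0
           for p, r in zip(precisions, recalls)]
    accuracy = sum(cm[c][c] for c in range(k)) / sum(map(sum, cm))
    return sum(precisions) / k, sum(recalls) / k, sum(f1s) / k, accuracy
```

For a perfectly diagonal confusion matrix all four values are 1.0; off-diagonal entries pull precision and recall down for the affected classes only, which is why macro averages expose weak classes that overall accuracy can hide.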
Figure 12. Confusion matrices for VGG-Net, MobileNet, ResNet, and DenseNet: (a) confusion matrix for VGG-16; (b) confusion matrix for VGG-19; (c) confusion matrix for MobileNet; (d) confusion matrix for MobileNetV2; (e) confusion matrix for ResNet-50; (f) confusion matrix for ResNet-101; (g) confusion matrix for DenseNet-121; (h) confusion matrix for DenseNet-169.
Figure 13. Features learned by DenseNet-121 before and after Bayesian optimization: (a) before applying Bayesian optimization; (b) after applying Bayesian optimization.