Wael Abdulsalam Hamwi, Muhammad Mazen Almustafa.
Abstract
The contagious SARS-CoV-2 has had a tremendous impact on the life and health of many communities. It first became rampant in late 2019, and so far 539 million cases of COVID-19 have been reported worldwide, reminiscent of the 1918 influenza pandemic. However, infected cases of COVID-19 can be detected by analysing either X-ray or CT images, which are presumably among the least expensive methods. With state-of-the-art convolutional neural networks (CNNs), which integrate image pre-processing techniques with fully connected layers, we can develop a sophisticated AI system contingent on various pre-trained models. Each pre-trained model involved in our study assumed its role in extracting specific features from chest image datasets drawn from verified sources (Mendeley, Kaggle, and GitHub). First, for the CXR datasets, we trained a CNN from scratch comprising four layers: a Conv2D layer with 32 filters followed by MaxPooling, with this pattern then repeated. We used two techniques to avoid overfitting: early stopping and Dropout. The output layer was a single neuron with a sigmoid activation to classify the two cases (0 or 1), and we used the Adam optimizer owing to its better results compared with other optimizers. Finally, we evaluated our findings using a confusion matrix, a classification report (recall and precision), sensitivity, and specificity; with this approach, we achieved a classification accuracy of 96%. Our three integrated pre-trained models (VGG16, DenseNet201, and DenseNet121) yielded a remarkable test accuracy of 98.81%. Moreover, our merged model (VGG16, DenseNet201) trained on CT images achieved a test accuracy of 99.73% for binary classification in the Normal/COVID-19 scenario.
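The layer operations named above (Conv2D, MaxPooling, and a sigmoid output neuron) can be illustrated with a minimal NumPy sketch. This is not the authors' Keras configuration; the filter, image size, and bias are illustrative assumptions chosen only to show what each stage computes.

```python
import numpy as np

def conv2d(image, kernel):
    """Valid 2D cross-correlation of a single-channel image with one filter."""
    h, w = image.shape
    kh, kw = kernel.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def max_pool(x, size=2):
    """Non-overlapping size x size max pooling."""
    h, w = x.shape
    h, w = h - h % size, w - w % size
    return x[:h, :w].reshape(h // size, size, w // size, size).max(axis=(1, 3))

def sigmoid(z):
    """Squash a logit into a probability in (0, 1) for the single output neuron."""
    return 1.0 / (1.0 + np.exp(-z))

# Tiny forward pass: one 3x3 averaging filter, ReLU, 2x2 pooling, sigmoid output.
img = np.arange(36, dtype=float).reshape(6, 6)
feat = np.maximum(conv2d(img, np.ones((3, 3)) / 9.0), 0.0)  # Conv2D + ReLU
pooled = max_pool(feat)                                      # MaxPooling halves each dim
prob = sigmoid(pooled.mean() - 10.0)                         # one-neuron binary output
print(feat.shape, pooled.shape)  # (4, 4) (2, 2)
```

In the paper's model a stack of 32 such filters is learned per Conv2D layer and the Conv2D/MaxPooling pair is repeated, but each filter performs exactly this sliding-window computation.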
Comparing our results with related studies shows that our proposed models were superior to previous CNN machine-learning models in terms of various performance metrics. Our pre-trained model associated with the CT dataset achieved an F1-score of 100%, with a loss value of approximately 0.00268.
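All of the evaluation metrics reported here (precision, recall/sensitivity, specificity, F1-score, accuracy) derive from the four cells of a binary confusion matrix. A minimal sketch of those formulas, with illustrative counts that are not the paper's actual data:

```python
def binary_metrics(tp, fp, fn, tn):
    """Standard binary-classification metrics from confusion-matrix counts."""
    precision = tp / (tp + fp)                    # of predicted positives, how many correct
    recall = tp / (tp + fn)                       # sensitivity: positives actually found
    specificity = tn / (tn + fp)                  # negatives correctly rejected
    f1 = 2 * precision * recall / (precision + recall)  # harmonic mean of P and R
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    return precision, recall, specificity, f1, accuracy

# Illustrative example: 360 COVID scans detected, 4 missed, 1 false alarm, 380 normals correct.
p, r, spec, f1, acc = binary_metrics(tp=360, fp=1, fn=4, tn=380)
print(f"precision={p:.3f} recall={r:.3f} specificity={spec:.3f} F1={f1:.3f} acc={acc:.4f}")
```

The perfect F1-score reported for the CT model corresponds to the limiting case where false positives and false negatives are both (near) zero.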
Keywords: AI, Artificial Intelligence; ANNs, Artificial Neural Networks; CNNs, Convolutional Neural Networks; CT, Computed Tomography; CXR, Chest X-Ray; Conv2D, 2D Convolutional Layer; COVID-19, Coronavirus Disease of 2019; DL, Deep Learning; ML, Machine Learning; RT-PCR, Reverse Transcription Polymerase Chain Reaction; ReLU, Rectified Linear Unit; SARS-CoV-2, Severe Acute Respiratory Syndrome Coronavirus 2; X-ray, energetic high-frequency electromagnetic radiation; CXR and CT chest COVID-19 images; integration of three pre-trained CNN models; fine-tuning; image processing; performance evaluation
Year: 2022 PMID: 35822170 PMCID: PMC9263684 DOI: 10.1016/j.imu.2022.101004
Source DB: PubMed Journal: Inform Med Unlocked ISSN: 2352-9148
Fig. 1 The architecture of the proposed model built from scratch.
Fig. 2 The architecture of the integrated transfer-learning models.
Fig. 3 Accuracy and loss of VGG16, DenseNet201, and DenseNet169, respectively, for CT data in [14].
Fig. 4 Accuracy and loss of VGG16, DenseNet201, and DenseNet121, respectively, for data in [13] + [46].
Classification report of our models trained on the two datasets.
| Model | Image status | F1-score | Precision | Recall | Support | Overall accuracy | Test loss | Dataset |
|---|---|---|---|---|---|---|---|---|
| Integrated model 1 | COVID | 1.00 | 1.00 | 0.99 | 364 | 99.73 | 0.00268 | [14] |
| | Normal | 1.00 | 0.99 | 1.00 | 381 | | | |
| Integrated model 2 | COVID | 0.99 | 1.00 | 0.98 | 698 | 98.81 | 0.01194 | [13] + [46] |
| | Normal | 0.99 | 0.98 | 0.98 | 474 | | | |
| DenseNet169 | COVID | 0.97 | 0.99 | 0.96 | 364 | 97.18 | 0.09095 | [14] |
| | Normal | 0.97 | 0.96 | 0.96 | 381 | | | |
| VGG16 | COVID | 0.98 | 0.99 | 0.98 | 698 | 98.21 | 0.04855 | [13] + [46] |
| | Normal | 0.98 | 0.97 | 0.98 | 474 | | | |
| VGG16 | COVID | 0.98 | 0.98 | 0.97 | 364 | 97.58 | 0.06493 | [14] |
| | Normal | 0.98 | 0.97 | 0.98 | 381 | | | |
| DenseNet201 | COVID | 0.98 | 0.98 | 0.98 | 698 | 97.78 | 0.05813 | [13] + [46] |
| | Normal | 0.97 | 0.97 | 0.98 | 474 | | | |
| DenseNet201 | COVID | 0.97 | 0.97 | 0.97 | 364 | 97.32 | 0.07250 | [14] |
| | Normal | 0.97 | 0.97 | 0.97 | 381 | | | |
| VGG19 | COVID | 0.98 | 0.98 | 0.98 | 698 | 97.53 | 0.06570 | [13] + [46] |
| | Normal | 0.97 | 0.97 | 0.97 | 474 | | | |
| VGG19 | COVID | 0.96 | 0.96 | 0.96 | 364 | 96.38 | 0.09129 | [14] |
| | Normal | 0.96 | 0.97 | 0.96 | 381 | | | |
| DenseNet121 | COVID | 0.98 | 0.99 | 0.98 | 698 | 97.95 | 0.06453 | [13] + [46] |
| | Normal | 0.97 | 0.97 | 0.98 | 474 | | | |
| DenseNet121 | COVID | 0.96 | 0.97 | 0.95 | 364 | 95.97 | 0.10428 | [14] |
| | Normal | 0.96 | 0.95 | 0.97 | 381 | | | |
Classification report of our models compared to other studies.
| Author | Architecture | Image status | F1-score | Precision | Recall | Overall accuracy | Dataset |
|---|---|---|---|---|---|---|---|
| This Study | Integrated model 1 | COVID | 1.00 | 1.00 | 0.99 | 99.73 | [14] |
| | | Normal | 1.00 | 0.99 | 1.00 | | |
| [ | VGG16 | COVID | 0.98 | 1.00 | 0.96 | 97.68 | Kaggle 3873 |
| | | Normal | | | | | |
| [ | VGG19 | COVID | – | – | – | 94.52 | [ |
| | | Normal | – | – | – | | |
| [ | VGGNet-19 | COVID | 86.5 | 88.5 | 86 | 87 | [ |
| | | Normal | | | | | |
| [ | LeNet-5 | – | 87 | 85 | 89 | 86.06 | [ |
| This Study | Integrated model 2 | COVID | 0.99 | 1.00 | 0.98 | 98.81 | [13] + [46] |
| | | Normal | 0.99 | 0.98 | 0.99 | | |
| [ | MobileNetV2 | COVID | 97 | 97 | 98 | 98 | [ |
| | | Normal | | | | | |
| [ | Fusion model | COVID | 95.50 | 96.10 | 96.42 | 96 | Kaggle |
| | | Normal | | | | | |
| [ | VGG16 | COVID | 0.94 | 0.93 | 0.95 | 0.95 | [ |
| | | Normal | 0.98 | 0.98 | 0.98 | | |
| [ | MobileNetV2 | COVID | 0.94 | 0.99 | 0.90 | 0.93 | [ |
| | | Normal | 0.91 | 0.85 | 0.98 | | |
Fig. 6 Confusion matrices of VGG16, DenseNet201, and DenseNet169, respectively, for CT data in [14].
Fig. 7 Confusion matrices of VGG16, DenseNet201, and DenseNet121, respectively, for data in [13] + [46].
Fig. 8 Confusion matrices of integrated model 1 and integrated model 2 for data in Ref. [14] and [13] + [46], respectively.
Fig. 5 Depiction of the image-processing steps carried out by the neural network.