Literature DB >> 33330341

Analysis of COVID-19 Infections on a CT Image Using DeepSense Model.

Adil Khadidos1, Alaa O Khadidos2, Srihari Kannan3, Yuvaraj Natarajan4, Sachi Nandan Mohanty5, Georgios Tsaramirsis6.   

Abstract

In this paper, a data mining model on a hybrid deep learning framework is designed to diagnose the medical conditions of patients infected with the coronavirus disease 2019 (COVID-19) virus. The hybrid deep learning model is designed as a combination of a convolutional neural network (CNN) and a recurrent neural network (RNN) and is named the DeepSense method. It is designed as a series of layers to extract and classify the related features of COVID-19 infections from the lungs. The computerized tomography image is used as the input data; hence, the classifier is designed to ease the classification process by learning the multidimensional input data using the expert hidden layers. The model is validated against medical image datasets to predict the infections using deep learning classifiers. The results show that the DeepSense classifier offers improved accuracy over the conventional deep and machine learning classifiers. The proposed method is validated against three different datasets using 70%, 80%, and 90% training splits. It specifically demonstrates the quality of the diagnostic method adopted for the prediction of COVID-19 infections in a patient.
Copyright © 2020 Khadidos, Khadidos, Kannan, Natarajan, Mohanty and Tsaramirsis.

Entities:  

Keywords:  COVID-19; CT images; DeepSense; artificial intelligence; convolutional neural network; prediction

Mesh:

Year:  2020        PMID: 33330341      PMCID: PMC7714903          DOI: 10.3389/fpubh.2020.599550

Source DB:  PubMed          Journal:  Front Public Health        ISSN: 2296-2565


Introduction

The novel coronavirus disease 2019 (COVID-19) is a pandemic outbreak (1). COVID-19 patients are classified essentially based on computerized tomography (CT) lung images, which are used widely for testing. Healthcare institutions fitted with CT scanners help in acquiring and classifying CT images at a faster rate. However, an expert medical practitioner is still required to verify the final results, which increases the computation time (2). On the other hand, supervised learning models (3–10) can be utilized for classifying patients from CT images, while very few unsupervised methods (11–24) have been applied to classify infections from CT images. We have developed a model that combines supervised and unsupervised learning in order to improve the classification process. The aim is to classify infected patients automatically based on their CT images. In this paper, a DeepSense algorithm is utilized to diagnose COVID-19 infections. The deep learning method is designed as a combination of a convolutional neural network (CNN) and a recurrent neural network (RNN), which reduces the classifier burden in the optimal classification of multidimensional data features. The main contributions of the work are as follows: (a) The authors develop a combined CNN and RNN to classify medical image datasets. (b) Experiments measure the correctness in terms of accuracy, precision, and recall against artificial neural network (ANN), feedforward neural network (FFNN), back propagation neural network (BPNN), deep neural network (DNN), and RNN classifiers. The outline of the paper is as follows: The Methods and DeepSense Model sections provide the details of the ensemble classifier, the Results and Discussions section evaluates the work, and the Conclusions and Future Work section concludes the work with future enhancements.

Methods

The deep learning model, namely, the DeepSense algorithm, is a combination of CNN and RNN designed to improve classification accuracy. DeepSense learning is regarded as a module for accurate prediction of lung infections caused by the COVID-19 virus. Figure 1 shows the architecture of the proposed classification model using the DeepSense algorithm.
Figure 1

Proposed model for classification.


DeepSense Model

Figure 2 shows the DeepSense DNN (25) model, which has three components: convolutional layers, recurrent layers, and an output layer, stacked upon one another. The convolutional and recurrent layers are regarded as the significant building blocks (Figure 1), and the output layer is a task-specific layer that classifies the images. The DeepSense DNN model is designed for the classification of input CT images for COVID-19-related infections.
Figure 2

Proposed DeepSense deep neural network (DNN) architecture.

The DeepSense network avoids gradient explosion and improves the rate of convergence using residual learning, an adjustable learning rate, and gradient clipping, which help to optimize the training process. The features extracted through the DeepSense model increase reconstruction accuracy and reduce training time. Such optimization helps in obtaining rich feature information, giving the model a better ability to classify.
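The three training optimizations named above can be sketched as follows. This is an illustrative NumPy sketch, not the authors' implementation; the function names and the clipping/decay hyperparameters are our assumptions:

```python
import numpy as np

def clip_by_global_norm(grads, max_norm):
    """Gradient clipping: rescale a list of gradient arrays so their
    global L2 norm never exceeds max_norm (prevents gradient explosion)."""
    global_norm = float(np.sqrt(sum(np.sum(g ** 2) for g in grads)))
    if global_norm > max_norm:
        grads = [g * (max_norm / global_norm) for g in grads]
    return grads, global_norm

def step_decay_lr(base_lr, epoch, drop=0.5, epochs_per_drop=10):
    """Adjustable learning rate: decay the base rate by `drop`
    every `epochs_per_drop` epochs."""
    return base_lr * (drop ** (epoch // epochs_per_drop))

def residual_block(x, layer_fn):
    """Residual learning: the layer models only the residual F(x),
    and the input is added back through a skip connection."""
    return x + layer_fn(x)
```

For example, a gradient of [3, 4] has norm 5; clipping to a max norm of 1 rescales it to [0.6, 0.8] without changing its direction.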

Convolutional Layers

The convolutional layers have three parts: an individual convolutional subnet for the input X(k) from each CT device, where k indexes the device, and a merged convolutional subnet for the outputs of the K individual subnets. For a time interval t, the matrix X(k) is used as input to the DNN architecture, which extracts the relationships in X(k,t), including those lying inside the frequency domain. The sensor measurement interactions span entire dimensions, and the frequency domain usually contains several local patterns. These interactions are studied using 2D filters, which produce the output X(k,1,t) based on the local patterns and dimensions in the frequency domain. Higher-level relationships are learned hierarchically by applying 1D filters. The resulting matrix is then flattened into a vector, and the vectors are concatenated to produce the input for the RNN layers. The activation function in the convolutional layers is the rectified linear unit (ReLU), and batch normalization eliminates internal covariate shift.
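A minimal, framework-free sketch of what one 2D filter and the ReLU activation compute over local patterns (real implementations use optimized convolution libraries; the function names here are illustrative):

```python
import numpy as np

def conv2d_valid(x, kernel):
    """'Valid' 2D convolution (cross-correlation, as in deep learning
    frameworks) of a single-channel input with one filter."""
    kh, kw = kernel.shape
    oh, ow = x.shape[0] - kh + 1, x.shape[1] - kw + 1
    out = np.empty((oh, ow))
    for i in range(oh):
        for j in range(ow):
            # inner product of the filter with one local patch
            out[i, j] = np.sum(x[i:i + kh, j:j + kw] * kernel)
    return out

def relu(x):
    """Rectified linear unit applied after each convolution."""
    return np.maximum(x, 0.0)
```

For a 3×3 input of ones and a 2×2 kernel of ones, every 2×2 patch sums to 4, so the output is a 2×2 map of fours.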

Recurrent Layers

The RNN architecture learns the needed features with long-term dependencies (long paths). The study uses Gated Recurrent Units (GRUs) for long and short path selection to reduce network complexity. A stack of three GRU layers is used in this paper, run incrementally along the time flow for faster input data processing. The recurrent layers output the vector series {x(r,t)}, where t = 1, 2, ..., T, for classification at the output layer.
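A minimal NumPy sketch of one GRU time step and a three-layer stack over a sequence. The weights are random placeholders and biases are omitted, so this illustrates the recurrence only, not the trained model:

```python
import numpy as np

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

def gru_step(x, h, p):
    """One Gated Recurrent Unit step (biases omitted for brevity).
    p = (Wz, Uz, Wr, Ur, Wh, Uh): input and recurrent weights per gate."""
    Wz, Uz, Wr, Ur, Wh, Uh = p
    z = sigmoid(Wz @ x + Uz @ h)               # update gate
    r = sigmoid(Wr @ x + Ur @ h)               # reset gate
    h_tilde = np.tanh(Wh @ x + Uh @ (r * h))   # candidate state
    return (1.0 - z) * h + z * h_tilde         # new hidden state

def run_stacked_gru(seq, hidden, num_layers=3, seed=0):
    """Stack of GRU layers: the output sequence of layer k feeds layer k+1."""
    rng = np.random.default_rng(seed)
    in_size = seq[0].shape[0]
    for _ in range(num_layers):
        p = (rng.standard_normal((hidden, in_size)) * 0.1,
             rng.standard_normal((hidden, hidden)) * 0.1,
             rng.standard_normal((hidden, in_size)) * 0.1,
             rng.standard_normal((hidden, hidden)) * 0.1,
             rng.standard_normal((hidden, in_size)) * 0.1,
             rng.standard_normal((hidden, hidden)) * 0.1)
        h = np.zeros(hidden)
        out = []
        for x in seq:
            h = gru_step(x, h, p)
            out.append(h)
        seq = out
        in_size = hidden
    return seq
```

Because each new state is a convex combination of the previous state and a tanh candidate, every hidden value stays strictly inside (-1, 1).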

Output Layer

For the purpose of classification, {x(r,t)} is selected as the feature vector, and this layer converts the variable-length vector series into a fixed-length vector. The final feature x(r) is generated by averaging the features over a specific time interval based on long or short paths. Finally, the probability of the predicted category is generated by feeding the averaged feature into a softmax layer.
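The time-averaging and softmax step can be sketched as follows (illustrative; W_out is a placeholder output weight matrix, not part of the original description):

```python
import numpy as np

def softmax(logits):
    e = np.exp(logits - np.max(logits))  # shift for numerical stability
    return e / e.sum()

def classify_sequence(recurrent_outputs, W_out):
    """Average the variable-length series {x(r,t)} into the fixed-length
    feature x(r), then map it to class probabilities with softmax."""
    x_r = np.mean(recurrent_outputs, axis=0)
    return softmax(W_out @ x_r)
```

Whatever the sequence length T, the averaged feature x(r) has a fixed dimension, which is what lets a single output layer handle sequences of any length.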

Type-Specific Layer

For the customization of the DeepSense layer to the classification task, we specifically use the following process:
Step 1: Identify the input image.
Step 2: Preprocess the input image for temporal and spectral noise.
Step 3: Extract the features related to COVID-19 infections.
Step 4: Apply the DeepSense classifier for optimal classification.
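The four steps can be chained as in the following sketch. Every function here is a hypothetical stand-in (a mean filter for noise removal, summary statistics for features, a threshold rule for the classifier), not the DeepSense implementation:

```python
import numpy as np

def load_ct_image(raw):
    """Step 1: identify/load the input image (here: accept any array-like)."""
    return np.asarray(raw, dtype=float)

def denoise(image, k=3):
    """Step 2: mean-filter preprocessing as a stand-in for
    temporal/spectral noise removal."""
    pad = k // 2
    padded = np.pad(image, pad, mode="edge")
    out = np.empty_like(image)
    for i in range(image.shape[0]):
        for j in range(image.shape[1]):
            out[i, j] = padded[i:i + k, j:j + k].mean()
    return out

def extract_features(image):
    """Step 3: toy feature vector (mean, std, max intensity)."""
    return np.array([image.mean(), image.std(), image.max()])

def classify(features, threshold=0.5):
    """Step 4: threshold rule standing in for the DeepSense classifier."""
    return "infected" if features[0] > threshold else "normal"

def pipeline(raw):
    return classify(extract_features(denoise(load_ct_image(raw))))
```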

Results and Discussions

This section provides the results of a comparison between machine/deep learning classifiers for predicting COVID-19 infections using the IEEE8023 (26), COVID-CT-Dataset (27), and COVID-19 Open Research Dataset Challenge (CORD-19) (28) datasets. IEEE8023 collects CT images from various sources, covering COVID-19 as well as viral and bacterial pneumonias. COVID-CT has 349 COVID-19 CT images from 216 patients and 463 non-COVID-19 CTs. CORD-19 collects CT image resources from 52,000 scholarly articles. The experiments use 10-fold cross-validation, tested on all three datasets.
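A 10-fold split can be sketched in pure Python as below; index assignment is round-robin here as an assumption, since the study does not specify how folds were drawn:

```python
def k_fold_indices(n_samples, k=10):
    """Partition sample indices into k disjoint folds; each fold serves
    once as the test set while the remaining folds form the training set."""
    folds = [list(range(i, n_samples, k)) for i in range(k)]
    splits = []
    for i, test_idx in enumerate(folds):
        train_idx = [j for f in folds[:i] + folds[i + 1:] for j in f]
        splits.append((train_idx, test_idx))
    return splits
```

Each of the k splits uses (k-1)/k of the data for training, and every sample appears in exactly one test fold.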

Experiment

The performance of the DeepSense classifier is estimated against various metrics: accuracy, geometric mean (G-mean), F-measure, precision, mean absolute percentage error (MAPE), specificity, and sensitivity. Let TP, TN, FP, and FN denote the true positives, true negatives, false positives, and false negatives, respectively. The metrics are defined as follows:

Accuracy = (TP + TN) / (TP + TN + FP + FN)
Precision = TP / (TP + FP)
Sensitivity = TP / (TP + FN)
Specificity = TN / (TN + FP)
F-measure = 2 × (Precision × Sensitivity) / (Precision + Sensitivity)
G-mean = sqrt(Sensitivity × Specificity)
MAPE = (100 / n) × Σ |(A − F) / A|

where A is the actual class, F is the predicted class, and n is the number of fitted points.
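These metric definitions translate directly into code; a minimal sketch (the TP/TN/FP/FN counts are arguments, and the helper names are ours):

```python
import math

def accuracy(tp, tn, fp, fn):
    return (tp + tn) / (tp + tn + fp + fn)

def precision(tp, fp):
    return tp / (tp + fp)

def sensitivity(tp, fn):   # true positive rate (recall)
    return tp / (tp + fn)

def specificity(tn, fp):   # true negative rate
    return tn / (tn + fp)

def f_measure(tp, fp, fn):
    p, r = precision(tp, fp), sensitivity(tp, fn)
    return 2 * p * r / (p + r)

def g_mean(tp, tn, fp, fn):
    return math.sqrt(sensitivity(tp, fn) * specificity(tn, fp))

def mape(actual, predicted):
    """Mean absolute percentage error over the n fitted points."""
    n = len(actual)
    return 100.0 / n * sum(abs((a - f) / a) for a, f in zip(actual, predicted))
```

For example, with TP = 90, FN = 10, TN = 80, FP = 20: accuracy is 170/200 = 0.85, sensitivity 0.9, specificity 0.8, and the F-measure 6/7 ≈ 0.857.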

Analysis

In this section, we provide the results of the various meta-ensemble classifiers, which include FFNN (29), ANN (25), DNN (30), BPNN (31), and RNN (32). The proposed method is validated against three different datasets using 70% (Figure 3), 80% (Figure 4), and 90% (Figure 5) training splits.
Figure 3

Results of classification accuracy during training with 70% training data.

Figure 4

Results of classification accuracy during training with 80% training data.

Figure 5

Results of classification accuracy during training with 90% training data.

Figure 3 shows that the classification accuracy on the CORD-19 dataset is higher for all residuals and increases as the residuals increase. The same holds for the other training splits; however, with the 80% split, the accuracy fluctuates owing to the extraction of non-optimal features from the IEEE8023 dataset. Tables 1, 4, and 7 provide the statistical parameters for predicting COVID-19 infections with 70%, 80%, and 90% training data on the IEEE8023 dataset.
Table 1

Results of statistical parameters for IEEE8023 with 70% training data on 1,000 images.

Statistical parameters   ANN        FFNN       BPNN       DNN        RNN        DeepSense
Accuracy                 55.67145   55.97152   58.06198   58.32304   59.68335   80.475
F-measure                38.39159   40.49205   51.72857   51.8886    54.26013   83.65671
G-mean                   72.54022   72.77127   74.27161   74.31162   74.72171   85.57814
MAPE                     28.32533   25.38368   23.98336   21.40179   20.82166   16.1186
Sensitivity              61.74481   65.25659   73.16136   85.54813   86.20828   96.25452
Specificity              74.18159   74.37163   77.88342   77.90342   79.27473   80.11492

ANN, artificial neural network; BPNN, back propagation neural network; DNN, deep neural network; FFNN, feedforward neural network; MAPE, mean absolute percentage error; RNN, recurrent neural network.

Table 4

Results of statistical parameters for IEEE8023 with 80% training data on 1,000 images.

Statistical parameters   ANN        FFNN       BPNN       DNN        RNN        DeepSense
Accuracy                 96.46457   97.18473   97.21474   97.29476   97.30476   97.43479
F-measure                52.36871   69.72859   70.04966   72.93131   76.16303   79.36475
G-mean                   81.88631   82.7365    84.35786   85.91821   90.96134   92.48168
MAPE                     26.90502   25.51371   22.74209   20.07049   10.60537   90.12115
Sensitivity              68.69836   70.08967   72.86129   75.54189   84.99801   88.59981
Specificity              96.53459   97.32476   97.52481   97.60483   97.62483   97.68484

ANN, artificial neural network; BPNN, back propagation neural network; DNN, deep neural network; FFNN, feedforward neural network; MAPE, mean absolute percentage error; RNN, recurrent neural network.

Table 7

Results of statistical parameters for IEEE8023 with 90% training data on 1,000 images.

Statistical parameters   ANN        FFNN       BPNN       DNN        RNN        DeepSense
Accuracy                 95.91445   95.93445   95.94446   96.03448   96.05448   96.11449
F-measure                77.3833    77.51333   78.03345   79.11469   79.79484   80.08491
G-mean                   79.43476   79.67482   79.94488   80.9451    81.26517   81.45622
MAPE                     31.14697   30.78688   30.26677   28.65541   28.1553    27.83522
Sensitivity              64.45641   64.81649   65.33661   66.94797   67.44808   67.76815
Specificity              94.72318   94.7832    94.8232    96.05448   96.46457   96.81465

ANN, artificial neural network; BPNN, back propagation neural network; DNN, deep neural network; FFNN, feedforward neural network; MAPE, mean absolute percentage error; RNN, recurrent neural network.

Tables 2, 5, and 8 provide the statistical parameters for predicting COVID-19 infections with 70%, 80%, and 90% training data on the COVID-CT dataset.
Table 2

Results of statistical parameters for COVID-CT with 70% training data on 1,000 images.

Statistical parameters   ANN        FFNN       BPNN       DNN        RNN        DeepSense
Accuracy                 56.26158   58.87317   61.23469   62.605     65.84672   84.68794
F-measure                66.74693   66.79694   67.68814   68.80839   73.84151   79.40476
G-mean                   43.50373   56.43162   59.4633    44.705     76.09302   85.98823
MAPE                     19.37033   16.69873   16.60871   11.75563   10.42533   9.275074
Sensitivity              76.23305   78.91465   79.00467   83.84675   85.18805   86.33831
Specificity              73.39141   76.27306   77.21327   80.37497   82.37642   84.46789

ANN, artificial neural network; BPNN, back propagation neural network; COVID-19, coronavirus disease 2019; DNN, deep neural network; FFNN, feedforward neural network; MAPE, mean absolute percentage error; RNN, recurrent neural network.

Table 5

Results of statistical parameters for COVID-CT with 80% training data on 1,000 images.

Statistical parameters   ANN        FFNN       BPNN       DNN        RNN        DeepSense
Accuracy                 97.73486   97.75486   97.77486   97.77486   97.78487   97.78487
F-measure                89.19995   90.63127   90.7813    91.29141   91.50146   92.1416
G-mean                   93.27286   96.67462   97.22474   97.52481   97.66484   97.66484
MAPE                     86.64838   27.01504   20.25053   9.275074   54.63022   21.0017
Sensitivity              88.94989   95.58337   96.68462   97.26475   97.54481   97.55482
Specificity              96.76464   96.78464   96.78464   96.78464   96.78464   97.42479

ANN, artificial neural network; BPNN, back propagation neural network; COVID-19, coronavirus disease 2019; DNN, deep neural network; FFNN, feedforward neural network; MAPE, mean absolute percentage error; RNN, recurrent neural network.

Table 8

Results of statistical parameters for COVID-CT with 90% training data on 1,000 images.

Statistical parameters   ANN        FFNN       BPNN       DNN        RNN        DeepSense
Accuracy                 97.37478   97.37478   97.45479   97.45479   97.4748    97.52481
F-measure                85.88821   86.00823   87.94967   87.97967   89.33998   89.34998
G-mean                   94.03303   94.03303   94.40311   94.47313   94.8032    94.84321
MAPE                     70.79983   70.61979   62.72503   61.39473   54.04008   53.35993
Sensitivity              90.53124   90.53124   91.34143   91.47146   92.21162   92.28164
Specificity              97.4648    97.4748    97.56482   97.56482   97.65484   97.65484

ANN, artificial neural network; BPNN, back propagation neural network; COVID-19, coronavirus disease 2019; DNN, deep neural network; FFNN, feedforward neural network; MAPE, mean absolute percentage error; RNN, recurrent neural network.

Tables 3, 6, and 9 provide the statistical parameters for predicting COVID-19 infections with 70%, 80%, and 90% training data on the CORD-19 dataset.
Table 3

Results of statistical parameters for CORD-19 with 70% training data on 1,000 images.

Statistical parameters   ANN        FFNN       BPNN       DNN        RNN        DeepSense
Accuracy                 59.07321   65.85673   68.8784    74.09157   77.93343   82.41643
F-measure                69.68858   69.93964   70.10968   70.28972   74.85174   80.39498
G-mean                   69.98965   70.2197    71.94009   74.00155   76.4631    79.21471
MAPE                     68.11823   64.47642   57.75191   39.63186   36.77022   34.91881
Sensitivity              77.52334   71.14991   71.87007   73.64147   73.84151   80.69505
Specificity              70.39974   72.28016   75.35185   80.58502   81.89631   82.30641

ANN, artificial neural network; BPNN, back propagation neural network; CORD-19, COVID-19 Open Research Dataset Challenge; COVID-19, coronavirus disease 2019; DNN, deep neural network; FFNN, feedforward neural network; MAPE, mean absolute percentage error; RNN, recurrent neural network.

Table 6

Results of statistical parameters for CORD-19 with 80% training data on 1,000 images.

Statistical parameters   ANN        FFNN       BPNN       DNN        RNN        DeepSense
Accuracy                 93.97301   94.12305   94.20307   94.25308   94.43312   94.44312
F-measure                58.25303   60.04343   60.3835    60.84361   62.37495   62.51498
G-mean                   79.1447    79.65481   80.18493   80.41498   81.52623   81.98633
MAPE                     29.90669   29.17552   28.28533   27.93525   26.10384   25.28365
Sensitivity              65.69669   66.42685   67.31805   67.67813   69.49854   70.32973
Specificity              95.2533    95.41334   95.42334   95.47335   95.51336   95.56337

ANN, artificial neural network; BPNN, back propagation neural network; CORD-19, COVID-19 Open Research Dataset Challenge; COVID-19, coronavirus disease 2019; DNN, deep neural network; FFNN, feedforward neural network; MAPE, mean absolute percentage error; RNN, recurrent neural network.

Table 9

Results of statistical parameters for CORD-19 with 90% training data on 1,000 images.

Statistical parameters   ANN        FFNN       BPNN       DNN        RNN        DeepSense
Accuracy                 97.44379   97.44379   97.52381   97.52381   97.54381   97.59382
F-measure                85.94922   86.06925   88.01168   88.04169   89.40299   89.41399
G-mean                   94.09904   94.09904   94.47013   94.54014   94.87022   94.91022
MAPE                     70.84984   70.6698    62.77004   61.43874   54.07909   53.39794
Sensitivity              90.59526   90.59526   91.40644   91.53647   92.27664   92.34765
Specificity              97.53381   97.54381   97.63383   97.63383   97.72385   97.72385

ANN, artificial neural network; BPNN, back propagation neural network; CORD-19, COVID-19 Open Research Dataset Challenge; COVID-19, coronavirus disease 2019; DNN, deep neural network; FFNN, feedforward neural network; MAPE, mean absolute percentage error; RNN, recurrent neural network.


Evaluation Criteria

The simulation results show that the DeepSense classifier has higher classification accuracy than the existing meta-ensemble classifiers. In addition, the CORD-19 dataset offers the best feature selection, increasing classification accuracy with 90% training data over the 70% and 80% splits. The other measurements are also more favorable for CORD-19 than for the other datasets. Furthermore, the MAPE of the deep learning model is lower than that of the other methods. The results show that the predictions on the CORD-19 dataset are more accurate than those of RNN and DNN. They also show that the classification accuracy with IEEE8023 as the feature selection source decreases at some point as the number of residuals increases, compared to COVID-CT and CORD-19. The class of infection is therefore accurately determined with the proposed classification.

Conclusions and Future Work

In this paper, a DeepSense algorithm is designed for the classification of COVID-19 infections. The DeepSense algorithm helps in the optimal classification of multidimensional features from CT images. The hybrid deep learning classifier, combining CNN and RNN, helps improve the prediction of events from a medical image. The extraction of optimal features by the feature extraction model helps the classifier detect whether or not a patient is infected. The experimental results show that the proposed method has higher accuracy than the other methods. In the future, the model can be extended with an ensemble data model to classify highly rated multidimensional datasets.

Data Availability Statement

The original contributions presented in the study are included in the article/supplementary materials; further inquiries can be directed to the corresponding author/s.

Author Contributions

AK: visualization and investigation. AOK: data curation, software, and validation. SK: methodology, data curation, review and editing, and supervision. YN: conceptualization, methodology, writing original draft, software, and data curation. SM: software and validation. GT: writing—review and editing and supervision.

Conflict of Interest

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.
References:  11 in total

1.  A Recurrent CNN for Automatic Detection and Classification of Coronary Artery Plaque and Stenosis in Coronary CT Angiography.

Authors:  Majd Zreik; Robbert W van Hamersvelt; Jelmer M Wolterink; Tim Leiner; Max A Viergever; Ivana Isgum
Journal:  IEEE Trans Med Imaging       Date:  2018-11-28       Impact factor: 10.048

2.  Knowledge-based Collaborative Deep Learning for Benign-Malignant Lung Nodule Classification on Chest CT.

Authors:  Yutong Xie; Yong Xia; Jianpeng Zhang; Yang Song; Dagan Feng; Michael Fulham; Weidong Cai
Journal:  IEEE Trans Med Imaging       Date:  2018-10-17       Impact factor: 10.048

3.  Deep Transfer Learning Based Classification Model for COVID-19 Disease.

Authors:  Y Pathak; P K Shukla; A Tiwari; S Stalin; S Singh; P K Shukla
Journal:  Ing Rech Biomed       Date:  2020-05-20

4.  A Weakly-Supervised Framework for COVID-19 Classification and Lesion Localization From Chest CT.

Authors:  Xinggang Wang; Xianbo Deng; Qing Fu; Qiang Zhou; Jiapei Feng; Hui Ma; Wenyu Liu; Chuansheng Zheng
Journal:  IEEE Trans Med Imaging       Date:  2020-08       Impact factor: 10.048

5.  Pulmonary Artery-Vein Classification in CT Images Using Deep Learning.

Authors:  Pietro Nardelli; Daniel Jimenez-Carretero; David Bermejo-Pelaez; George R Washko; Farbod N Rahaghi; Maria J Ledesma-Carbayo; Raul San Jose Estepar
Journal:  IEEE Trans Med Imaging       Date:  2018-05-04       Impact factor: 10.048

6.  FissureNet: A Deep Learning Approach For Pulmonary Fissure Detection in CT Images.

Authors:  Sarah E Gerard; Taylor J Patton; Gary E Christensen; John E Bayouth; Joseph M Reinhardt
Journal:  IEEE Trans Med Imaging       Date:  2018-08-10       Impact factor: 10.048

7.  COVID-19 identification in chest X-ray images on flat and hierarchical classification scenarios.

Authors:  Rodolfo M Pereira; Diego Bertolini; Lucas O Teixeira; Carlos N Silla; Yandre M G Costa
Journal:  Comput Methods Programs Biomed       Date:  2020-05-08       Impact factor: 5.428

8.  Using country-level variables to classify countries according to the number of confirmed COVID-19 cases: An unsupervised machine learning approach.

Authors:  Rodrigo M Carrillo-Larco; Manuel Castillo-Cara
Journal:  Wellcome Open Res       Date:  2020-06-15

9.  Classification of COVID-19 patients from chest CT images using multi-objective differential evolution-based convolutional neural networks.

Authors:  Dilbag Singh; Vijay Kumar; Manjit Kaur
Journal:  Eur J Clin Microbiol Infect Dis       Date:  2020-04-27       Impact factor: 3.267

10.  More Agility to Semantic Similarities Algorithm Implementations.

Authors:  Kostandinos Tsaramirsis; Georgios Tsaramirsis; Fazal Qudus Khan; Awais Ahmad; Alaa Omar Khadidos; Adil Khadidos
Journal:  Int J Environ Res Public Health       Date:  2019-12-30       Impact factor: 3.390

Cited by:  4 in total

1.  COVID-19 Isolation Control Proposal via UAV and UGV for Crowded Indoor Environments: Assistive Robots in the Shopping Malls.

Authors:  Muhammet Fatih Aslan; Khairunnisa Hasikin; Abdullah Yusefi; Akif Durdu; Kadir Sabanci; Muhammad Mokhzaini Azizan
Journal:  Front Public Health       Date:  2022-05-31

2.  Cognitive computing-based COVID-19 detection on Internet of things-enabled edge computing environment.

Authors:  E Laxmi Lydia; C S S Anupama; A Beno; Mohamed Elhoseny; Mohammad Dahman Alshehri; Mahmoud M Selim
Journal:  Soft comput       Date:  2021-11-18       Impact factor: 3.732

3.  Supervised and weakly supervised deep learning models for COVID-19 CT diagnosis: A systematic review. (Review)

Authors:  Haseeb Hassan; Zhaoyu Ren; Chengmin Zhou; Muazzam A Khan; Yi Pan; Jian Zhao; Bingding Huang
Journal:  Comput Methods Programs Biomed       Date:  2022-03-05       Impact factor: 7.027

4.  A Deep Learning Method to Forecast COVID-19 Outbreak.

Authors:  Satyabrata Dash; Sujata Chakravarty; Sachi Nandan Mohanty; Chinmaya Ranjan Pattanaik; Sarika Jain
Journal:  New Gener Comput       Date:  2021-07-18       Impact factor: 1.048

