Nazrul Anuar Nayan1,2, Choon Jie Yi1, Mohd Zubir Suboh1, Nur-Fadhilah Mazlan3, Petrick Periyasamy4, Muhammad Yusuf Zawir Abdul Rahim4, Shamsul Azhar Shah5.
Abstract
At present, COVID-19 is spreading widely around the world. It causes many health problems, namely, respiratory failure and acute respiratory distress syndrome. Wearable devices have gained popularity by allowing remote COVID-19 detection, contact tracing, and monitoring. In this study, the correlation of photoplethysmogram (PPG) morphology between patients with COVID-19 infection and healthy subjects was investigated. Then, machine learning was used to classify the extracted features between 43 cases and 43 control subjects. The PPG data were collected from 86 subjects based on inclusion and exclusion criteria. The systolic-onset amplitude was 3.72% higher for the case group. However, the time interval of systolic-systolic was 7.69% shorter in the case than in control subjects. In addition, 12 out of 20 features exhibited a significant difference. The top three features included dicrotic-systolic time interval, onset-dicrotic amplitude, and systolic-onset time interval. Nine features extracted by heatmap based on the correlation matrix were fed to discriminant analysis, k-nearest neighbor, decision tree, support vector machine, and artificial neural network (ANN). The ANN showed the best performance with 95.45% accuracy, 100% sensitivity, and 90.91% specificity by using six input features. In this study, a COVID-19 prediction model was developed using multiple PPG features extracted using a low-cost pulse oximeter.Entities:

Keywords: COVID-19; diagnostic; machine learning; non-invasive; photoplethysmogram; prediction
Year: 2022 PMID: 35928478 PMCID: PMC9343670 DOI: 10.3389/fpubh.2022.920849
Source DB: PubMed Journal: Front Public Health ISSN: 2296-2565
Figure 1. Methodology for developing the COVID-19 prediction model using the photoplethysmogram.
Figure 2. PPG data taken from the index finger of the right hand during a 10-min resting period.
Figure 3. Good- and poor-quality PPG signals, shown in blue and red, respectively.
Figure 4. PPG signal showing all the fiducial points.
PPG fiducial points and the features.
| Fiducial point | Features |
|---|---|
| Onset | o2o_wt, o2s_wt, o2n_wt, o2d_wt, o2s_hr, o2n_hr, o2d_hr, s2n_hr |
| Systolic | s2s_wt, s2o_wt, s2n_wt, s2d_wt |
| Notch | n2n_wt, n2s_wt, n2o_wt, n2d_wt |
| Diastolic | d2d_wt, d2o_wt, d2s_wt, d2n_wt |
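The naming above suggests that each feature is a time interval (wt) between two fiducial points. A minimal sketch of how such features could be computed from fiducial-point sample indices is shown below; the sampling rate, indices, and amplitude values are illustrative assumptions, not the study's data.

```python
# Sketch: deriving PPG time-interval (wt) features from fiducial-point
# sample indices. All numeric values here are illustrative assumptions.

fs = 100  # assumed sampling rate in Hz

# Assumed sample indices of fiducial points over two consecutive beats:
# onset (o) and systolic peak (s).
onset_idx = [0, 80]
systolic_idx = [15, 95]

def wt(i, j, fs):
    """Time interval in seconds between two fiducial-point indices."""
    return (j - i) / fs

o2s_wt = wt(onset_idx[0], systolic_idx[0], fs)     # onset to systolic peak
s2s_wt = wt(systolic_idx[0], systolic_idx[1], fs)  # systolic-to-systolic interval

# Amplitude features (e.g., the systolic-onset amplitude in the abstract)
# would use the PPG sample values at the same indices:
ppg_onset, ppg_systolic = 0.10, 0.85  # assumed normalized amplitudes
s2o_amp = ppg_systolic - ppg_onset

print(o2s_wt, s2s_wt, s2o_amp)
```

The heart-rate-style features (e.g., o2s_hr) would follow the same pattern, derived from the corresponding intervals or amplitudes.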
Figure 5. Performance evaluators for the ML-trained models.
Descriptive characteristics of the subjects (N = 86).
| Characteristic | Case (n = 43) | Control (n = 43) |
|---|---|---|
| Age (years), Mean (SD) | 57.93 (13.75) | 58.65 (13.82) |
| Gender | 23 M / 20 F | 23 M / 20 F |
| Ethnicity | 21 Malays | 24 Malays |
| | 18 Chinese | 18 Chinese |
| | 2 Indians | 0 Indians |
| | 1 Pakistani | 0 Pakistani |
| | 1 Indonesian | 1 Indonesian |
Figure 6. Comparison of PPG signals between (A) case and (B) control groups.
Extracted features of the case and control groups, ranked by P-value.
| No. | Feature | Case, Mean (SD) | Control, Mean (SD) | P-value |
|---|---|---|---|---|
| 1 |  |  |  |  |
| 2 |  |  |  |  |
| 3 |  |  |  |  |
| 4 |  |  |  |  |
| 5 |  |  |  |  |
| 6 |  |  |  |  |
| 7 |  |  |  |  |
| 8 |  |  |  |  |
| 9 |  |  |  |  |
| 10 |  |  |  |  |
| 11 |  |  |  |  |
| 12 |  |  |  |  |
| 13 | n2n_wt | 0.59 (0.44) | 0.54 (0.12) | 0.05662 |
| 14 | o2o_wt | 0.59 (0.43) | 0.54 (0.13) | 0.6368 |
| 15 | o2s_hr | 33.14 (10.36) | 31.95 (6.58) | 0.2209 |
| 16 | o2n_wt | 0.30 (0.09) | 0.31 (0.07) | 0.2479 |
| 17 | o2d_wt | 0.28 (0.04) | 0.31 (0.09) | 0.2657 |
| 18 | n2o_wt | 0.33 (0.51) | 0.22 (0.11) | 0.2931 |
| 19 | d2n_wt | 0.26 (0.19) | 0.25 (0.20) | 0.3687 |
| 20 | s2n_wt | 0.15 (0.07) | 0.37 (0.07) | 0.6301 |
The 12 bold items indicate the 12 statistically significant features with P-values of less than 0.05 (p < 0.05).
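The paper ranks features by P-value but this record does not state which two-sample test was used; Welch's t statistic is one common choice for comparing case and control means, sketched below on illustrative values (not the study's data).

```python
# Sketch: a two-sample test statistic for ranking features by case-vs-control
# separation. Welch's t is shown as one plausible option (an assumption);
# the input values below are illustrative only.
from math import sqrt

def welch_t(a, b):
    """Welch's t statistic for two samples with unequal variances."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((x - ma) ** 2 for x in a) / (na - 1)  # sample variances
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    return (ma - mb) / sqrt(va / na + vb / nb)

case = [0.59, 0.61, 0.58, 0.60]
control = [0.54, 0.55, 0.53, 0.56]
t = welch_t(case, control)
print(round(t, 3))
```

A larger |t| (and hence smaller P-value) indicates stronger separation between groups, which is what the ranking in the table reflects.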
Normality tests of kurtosis and skewness.
| No. | Feature | Kurtosis | Skewness |
|---|---|---|---|
| 1 |  |  |  |
| 2 |  |  |  |
| 3 | s2n_wt | 12.9306 | 2.6589 |
| 4 |  |  |  |
| 5 | o2o_wt | 35.5860 | 5.4855 |
| 6 | o2s_wt | 32.5283 | 4.8299 |
| 7 | o2n_wt | 3.3604 |  |
| 8 | o2d_wt | 31.3579 | 4.3775 |
| 9 | n2n_wt | 31.0130 | 5.5323 |
| 10 |  |  |  |
| 11 | n2o_wt | 32.7718 | 5.5978 |
| 12 |  |  |  |
| 13 |  |  |  |
| 14 |  |  |  |
| 15 |  |  |  |
| 16 |  |  |  |
| 17 | o2s_hr | 2.2197 |  |
| 18 |  |  |  |
| 19 |  |  |  |
| 20 |  |  |  |
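The table above reports kurtosis and skewness as normality checks. A minimal sketch of the standard moment-based definitions (non-excess kurtosis, so a normal distribution gives 3) is below; the data are illustrative, not the study's.

```python
# Sketch: sample skewness and (non-excess) kurtosis via central moments.
# Illustrative data only; definitions are the common population-moment
# estimators, which may differ from the exact estimator used in the paper.

def moments(x):
    n = len(x)
    m = sum(x) / n
    s2 = sum((v - m) ** 2 for v in x) / n   # second central moment
    m3 = sum((v - m) ** 3 for v in x) / n   # third central moment
    m4 = sum((v - m) ** 4 for v in x) / n   # fourth central moment
    skew = m3 / s2 ** 1.5
    kurt = m4 / s2 ** 2                     # equals 3 for a normal distribution
    return kurt, skew

data = [0.54, 0.55, 0.53, 0.56, 0.54, 0.90]  # one outlier inflates both moments
kurt, skew = moments(data)
print(round(kurt, 3), round(skew, 3))
```

Large kurtosis and skewness values like those in the table (e.g., 35.59 and 5.49 for o2o_wt) indicate strongly non-normal, heavy-tailed feature distributions, which motivates non-parametric handling.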
Figure 7. Heatmap of the correlation matrix for feature selection.
Figure 8. Correlation strength and P-value of the selected features.
Figure 9. Accuracy of the best-trained model for each ML method with an increasing number of input features.
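Both the correlation-matrix heatmap (Figure 7) and the correlation-strength ranking (Figure 8) rest on Pearson correlation. A minimal sketch, here between an assumed feature vector and the case/control label (values are illustrative, not the study's data):

```python
# Sketch: Pearson correlation coefficient, the quantity underlying the
# heatmap-based feature selection. Data below are illustrative assumptions.
from math import sqrt

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

label = [1, 1, 1, 0, 0, 0]                        # 1 = case, 0 = control
feature = [0.60, 0.58, 0.61, 0.54, 0.55, 0.53]    # assumed feature values
r = pearson(feature, label)
print(round(r, 3))
```

In a heatmap over all feature pairs, highly inter-correlated features are redundant and can be dropped, which is one way a 20-feature set is reduced to the nine features fed to the classifiers.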
Performance comparison of the best-trained model for all ML.
| | DA | KNN | DT | SVM | ANN |
|---|---|---|---|---|---|
| No. of features | 6 | 7 | 4 | 6 | 6 |
| Setting | Type = Linear | k = 7 | Split = 1 | Kernel = Linear | HN1 = 9 HN2 = 13 |
| Val MSE | 0.4167 | 0.3077 | 0.4615 | 0.3333 | 0.1538 |
| Val ACC | 58.33 | 69.23 | 53.85 | 66.67 | 84.62 |
| Test MSE | 0.0909 | 0.0455 | 0.0455 | 0.0909 | 0.0455 |
| Test SP | 90.91 | 100.00 | 90.91 | 90.91 | 90.91 |
| Test SN | 90.91 | 90.91 | 100.00 | 90.91 | 100.00 |
| Test ACC | 90.91 | 95.45 | 95.45 | 90.91 | 95.45 |
| AUC | 0.9174 | 0.9752 | 0.9463 | 0.8926 | 0.9587 |
DA, discriminant analysis; KNN, k-nearest neighbor; DT, decision tree; SVM, support vector machine; ANN, artificial neural network; k, number of neighbors; Val, validation; HN, hidden nodes.
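The test-set percentages in the table follow directly from confusion-matrix counts. The sketch below reproduces the ANN column, assuming (from the reported percentages) an 11 case / 11 control test split:

```python
# Sketch: sensitivity, specificity, and accuracy from confusion-matrix
# counts. The counts are inferred from the reported ANN percentages
# (an assumption), with 11 case and 11 control test subjects.
tp, fn = 11, 0   # all case subjects detected  -> sensitivity 100%
tn, fp = 10, 1   # one control misclassified   -> specificity 90.91%

sensitivity = 100 * tp / (tp + fn)
specificity = 100 * tn / (tn + fp)
accuracy = 100 * (tp + tn) / (tp + tn + fp + fn)

print(round(sensitivity, 2), round(specificity, 2), round(accuracy, 2))
# -> 100.0 90.91 95.45, matching the ANN column above
```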
Figure 10. ROC curves for the best-trained models of all ML methods.
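AUC values like those in the table are conventionally obtained by integrating the ROC curve with the trapezoidal rule. A minimal sketch, with illustrative ROC points rather than the study's curves:

```python
# Sketch: AUC by the trapezoidal rule over ROC points.
# The (fpr, tpr) points below are illustrative assumptions.

def auc_trapezoid(fpr, tpr):
    """Trapezoidal area under an ROC curve; fpr must be sorted ascending."""
    area = 0.0
    for i in range(1, len(fpr)):
        area += (fpr[i] - fpr[i - 1]) * (tpr[i] + tpr[i - 1]) / 2
    return area

fpr = [0.0, 0.0909, 0.3, 1.0]  # x-axis: 1 - specificity
tpr = [0.0, 0.9, 1.0, 1.0]     # y-axis: sensitivity
auc = auc_trapezoid(fpr, tpr)
print(round(auc, 4))
```

A curve hugging the top-left corner yields an AUC near 1, as with the KNN (0.9752) and ANN (0.9587) models in the table; the diagonal (random guessing) yields 0.5.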