Lal Hussain, Imtiaz Ahmed Awan, Wajid Aziz, Sharjil Saeed, Amjad Ali, Farukh Zeeshan, Kyung Sup Kwak.
Abstract
Abstract
The adaptability of the heart to external and internal stimuli is reflected by heart rate variability (HRV), and reduced HRV can be a predictor of negative cardiovascular outcomes. Because the controlling mechanisms of the cardiovascular system are nonlinear, nonstationary, and highly complex, linear HRV measures have limited ability to accurately characterize the underlying dynamics. In this study, we propose an automated system that analyzes HRV signals by extracting multimodal features to capture temporal, spectral, and complex dynamics. Robust machine learning techniques, such as the support vector machine (SVM) with its kernels (linear, Gaussian, radial basis function, and polynomial), decision tree (DT), k-nearest neighbor (KNN), and ensemble classifiers, were employed to evaluate detection performance. Performance was evaluated in terms of sensitivity, specificity, positive predictive value (PPV), negative predictive value (NPV), total accuracy (TA), and area under the receiver operating characteristic curve (AUC). The highest performance was obtained with the SVM linear kernel (TA = 93.1%, AUC = 0.97, 95% CI [lower bound = 0.04, upper bound = 0.89]), followed by the ensemble subspace discriminant classifier (TA = 91.4%, AUC = 0.96, 95% CI [lower bound = 0.07, upper bound = 0.81]) and the SVM medium Gaussian kernel (TA = 90.5%, AUC = 0.95, 95% CI [lower bound = 0.07, upper bound = 0.86]). The results reveal that the proposed approach can provide an effective and computationally efficient tool for automatic detection of congestive heart failure patients.
Year: 2020 PMID: 32149106 PMCID: PMC7049402 DOI: 10.1155/2020/4281243
Source DB: PubMed Journal: Biomed Res Int Impact factor: 3.411
Figure 1. Schematic diagram for the classification of NSR and CHF subjects.
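The time-domain HRV measures that feed this pipeline (SDNN, SDSD, and RMSSD, all reported in the feature table below) have standard definitions over a sequence of RR intervals. A minimal sketch in plain Python (helper name ours, not the authors' code):

```python
import math

def time_domain_features(rr):
    """Basic time-domain HRV measures from a list of RR intervals (seconds).

    SDNN  : standard deviation of all RR intervals
    SDSD  : standard deviation of successive RR differences
    RMSSD : root mean square of successive RR differences
    """
    n = len(rr)
    mean_rr = sum(rr) / n
    sdnn = math.sqrt(sum((x - mean_rr) ** 2 for x in rr) / (n - 1))
    diffs = [rr[i + 1] - rr[i] for i in range(n - 1)]
    mean_d = sum(diffs) / len(diffs)
    sdsd = math.sqrt(sum((d - mean_d) ** 2 for d in diffs) / (len(diffs) - 1))
    rmssd = math.sqrt(sum(d * d for d in diffs) / len(diffs))
    return {"SDNN": sdnn, "SDSD": sdsd, "RMSSD": rmssd}

feats = time_domain_features([0.80, 0.85, 0.78, 0.82, 0.79, 0.84])
```

SDANN (also in the feature table) additionally requires segmenting the recording into 5-minute windows before averaging, so it is omitted from this sketch.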
Figure 2. Computation of wavelet entropy.
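A wavelet entropy of the kind shown in Figure 2 can be illustrated as Shannon entropy over the relative energies of wavelet subbands. The sketch below uses a hand-rolled one-level Haar step (signal length must be divisible by 2^levels); the paper's exact wavelet family and entropy variants (WEShannon, WELogEn, WETh, WESure, WENorm) are not specified here, so treat this as an assumed minimal form:

```python
import math

def haar_level(x):
    # One level of the Haar transform: approximation and detail coefficients.
    a = [(x[2 * i] + x[2 * i + 1]) / math.sqrt(2) for i in range(len(x) // 2)]
    d = [(x[2 * i] - x[2 * i + 1]) / math.sqrt(2) for i in range(len(x) // 2)]
    return a, d

def wavelet_shannon_entropy(x, levels=3):
    # Collect the energy of each detail subband plus the final approximation,
    # then take Shannon entropy of the normalized energy distribution.
    energies = []
    for _ in range(levels):
        x, d = haar_level(x)
        energies.append(sum(c * c for c in d))
    energies.append(sum(c * c for c in x))
    total = sum(energies)
    p = [e / total for e in energies if e > 0]
    return -sum(pi * math.log(pi) for pi in p)
```

A constant signal concentrates all energy in the approximation band and yields zero entropy; signals spreading energy across scales yield positive entropy.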
Figure 3. (a) Error on margin using slack variables; (b) SVM nonlinear separation.
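The slack variables of Figure 3(a) measure how far each point falls on the wrong side of the soft margin: xi = max(0, 1 - y(w·x + b)). A small illustration for a fixed linear boundary (helper name ours):

```python
def hinge_slacks(w, b, X, y):
    """Slack variable xi_i = max(0, 1 - y_i * (w . x_i + b)) for a linear SVM.

    xi = 0      : correct side, outside the margin
    0 < xi <= 1 : inside the margin but still correctly classified
    xi > 1      : misclassified
    """
    slacks = []
    for x_vec, yi in zip(X, y):
        margin = yi * (sum(wj * xj for wj, xj in zip(w, x_vec)) + b)
        slacks.append(max(0.0, 1.0 - margin))
    return slacks
```

For w = [1, 0], b = 0 and positive-class points at x1 = 2, 0.5, and -1, the slacks are 0 (outside margin), 0.5 (inside margin), and 2 (misclassified).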
Figure 4. Decision tree split decision.
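A split decision like the one in Figure 4 is typically chosen by minimizing an impurity criterion. A minimal sketch using Gini impurity for binary labels (an assumed criterion; the paper does not state which one its trees use):

```python
def gini(labels):
    # Gini impurity for a list of 0/1 labels.
    n = len(labels)
    if n == 0:
        return 0.0
    p1 = sum(labels) / n
    return 1.0 - p1 ** 2 - (1 - p1) ** 2

def best_split(values, labels):
    """Exhaustively score thresholds on one feature by the
    size-weighted Gini impurity of the two children."""
    best = (None, float("inf"))
    for t in sorted(set(values)):
        left = [l for v, l in zip(values, labels) if v <= t]
        right = [l for v, l in zip(values, labels) if v > t]
        if not left or not right:
            continue
        score = (len(left) * gini(left) + len(right) * gini(right)) / len(values)
        if score < best[1]:
            best = (t, score)
    return best
```

On a perfectly separable feature ([1, 2, 3, 4] with labels [0, 0, 1, 1]) the threshold 2 gives two pure children and weighted impurity 0.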
CHF detection performance based on time domain features by applying machine learning techniques.
| Classifier | TPR (%) | TNR (%) | PPV (%) | NPV (%) | TA (%) | AUC | 95% CI LB | 95% CI UB |
|---|---|---|---|---|---|---|---|---|
| **Decision tree** | | | | | | | | |
| Fine | 78 | 77 | 68 | 85 | 77.6 | 0.73 | 0.22 | 0.77 |
| Coarse | 89 | 66 | 78 | 81 | 80.2 | 0.75 | 0.11 | 0.66 |
| **SVM** | | | | | | | | |
| Linear | 90 | 73 | 82 | 84 | 83.6 | 0.92 | 0.10 | 0.73 |
| Quadratic | 88 | 66 | 76 | 81 | 79.3 | 0.84 | 0.13 | 0.66 |
| Cubic | 85 | 70 | 74 | 82 | 79.3 | 0.88 | 0.15 | 0.70 |
| Med. Gaussian | 89 | 73 | 80 | 84 | 82.8 | 0.90 | 0.11 | 0.73 |
| **KNN** | | | | | | | | |
| Fine | 79 | 59 | 63 | 76 | 71.6 | 0.69 | 0.21 | 0.59 |
| Medium | 88 | 68 | 77 | 82 | 80.2 | 0.87 | 0.15 | 0.75 |
| Cosine | 82 | 80 | 73 | 87 | 81.0 | 0.83 | 0.18 | 0.80 |
| **Ensemble** | | | | | | | | |
| Bagged tree | 85 | 75 | 75 | 85 | 81.0 | 0.87 | 0.15 | 0.75 |
| Subsp. disc. | 96 | 66 | 91 | 82 | 84.5 | 0.91 | 0.04 | 0.66 |
| RUSBoosted tree | 76 | 68 | 64 | 80 | 73.3 | 0.81 | 0.24 | 0.68 |
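The per-classifier columns in these tables (TPR, TNR, PPV, NPV, TA) follow the standard confusion-matrix definitions. A minimal helper (ours, not the authors' code) that reproduces them as percentages:

```python
def detection_metrics(tp, fn, fp, tn):
    """Sensitivity (TPR), specificity (TNR), PPV, NPV, and total
    accuracy (TA) from confusion-matrix counts, as percentages."""
    return {
        "TPR": 100 * tp / (tp + fn),
        "TNR": 100 * tn / (tn + fp),
        "PPV": 100 * tp / (tp + fp),
        "NPV": 100 * tn / (tn + fn),
        "TA":  100 * (tp + tn) / (tp + tn + fp + fn),
    }
```

For example, 45 true positives, 5 false negatives, 10 false positives, and 40 true negatives give TPR = 90%, TNR = 80%, and TA = 85%.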
CHF detection performance based on frequency domain features by applying machine learning techniques.
| Classifier | TPR (%) | TNR (%) | PPV (%) | NPV (%) | TA (%) | AUC | 95% CI LB | 95% CI UB |
|---|---|---|---|---|---|---|---|---|
| **Decision tree** | | | | | | | | |
| Fine | 83 | 75 | 73 | 85 | 80.2 | 0.84 | 0.17 | 0.75 |
| Coarse | 93 | 64 | 85 | 81 | 81.9 | 0.81 | 0.07 | 0.64 |
| **SVM** | | | | | | | | |
| Linear | 82 | 77 | 72 | 86 | 80.2 | 0.86 | 0.18 | 0.77 |
| Quadratic | 82 | 82 | 73 | 88 | 81.9 | 0.88 | 0.18 | 0.82 |
| Cubic | 88 | 68 | 77 | 82 | 80.2 | 0.83 | 0.13 | 0.58 |
| Med. Gaussian | 90 | 77 | 83 | 87 | 85.3 | 0.90 | 0.16 | 0.77 |
| **KNN** | | | | | | | | |
| Fine | 85 | 75 | 75 | 85 | 81.0 | 0.86 | 0.15 | 0.75 |
| Medium | 89 | 66 | 78 | 81 | 80.2 | 0.88 | 0.11 | 0.66 |
| Cosine | 64 | 73 | 55 | 79 | 67.2 | 0.75 | 0.36 | 0.73 |
| **Ensemble** | | | | | | | | |
| Bagged tree | 85 | 77 | 76 | 86 | 81.9 | 0.88 | 0.15 | 0.77 |
| Subsp. disc. | 86 | 70 | 76 | 83 | 80.2 | 0.85 | 0.14 | 0.70 |
| RUSBoosted tree | 79 | 75 | 69 | 84 | 77.6 | 0.81 | 0.21 | 0.75 |
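The frequency-domain features reported in the feature table (TP, ULF, VLF, LF, HF, LF/HF) are band powers obtained by integrating a power spectral density over the conventional HRV bands. A minimal sketch assuming a uniform frequency grid and the usual band edges (ULF < 0.003 Hz, VLF 0.003-0.04 Hz, LF 0.04-0.15 Hz, HF 0.15-0.4 Hz); the PSD estimator itself is out of scope here:

```python
def band_powers(freqs, psd):
    """Integrate a PSD (uniform frequency grid) over standard HRV bands."""
    bands = {"ULF": (0.0, 0.003), "VLF": (0.003, 0.04),
             "LF": (0.04, 0.15), "HF": (0.15, 0.4)}
    df = freqs[1] - freqs[0]                       # grid spacing
    out = {name: 0.0 for name in bands}
    for f, p in zip(freqs, psd):
        for name, (lo, hi) in bands.items():
            if lo <= f < hi:
                out[name] += p * df                # rectangle-rule integration
    out["TP"] = sum(out.values())                  # total power over all bands
    out["LF/HF"] = out["LF"] / out["HF"] if out["HF"] else float("inf")
    return out
```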
CHF detection performance based on statistical features by applying machine learning techniques.
| Classifier | TPR (%) | TNR (%) | PPV (%) | NPV (%) | TA (%) | AUC | 95% CI LB | 95% CI UB |
|---|---|---|---|---|---|---|---|---|
| **Decision tree** | | | | | | | | |
| Fine | 81 | 68 | 68 | 81 | 75.9 | 0.77 | 0.19 | 0.68 |
| Coarse | 86 | 64 | 74 | 79 | 77.6 | 0.80 | 0.14 | 0.64 |
| **SVM** | | | | | | | | |
| Linear | 99 | 55 | 96 | 78 | 81.9 | 0.80 | 0.01 | 0.55 |
| Quadratic | 92 | 64 | 82 | 80 | 81.0 | 0.84 | 0.08 | 0.64 |
| Cubic | 85 | 61 | 71 | 78 | 75.9 | 0.78 | 0.15 | 0.61 |
| Med. Gaussian | 90 | 52 | 77 | 76 | 75.9 | 0.81 | 0.10 | 0.52 |
| **KNN** | | | | | | | | |
| Fine | 78 | 55 | 60 | 74 | 69.0 | 0.66 | 0.22 | 0.56 |
| Medium | 89 | 43 | 70 | 72 | 71.6 | 0.78 | 0.11 | 0.43 |
| Cosine | 89 | 48 | 72 | 74 | 73.3 | 0.78 | 0.11 | 0.48 |
| **Ensemble** | | | | | | | | |
| Bagged tree | 85 | 66 | 73 | 80 | 77.6 | 0.81 | 0.15 | 0.66 |
| Subsp. disc. | 99 | 34 | 94 | 71 | 74.1 | 0.77 | 0.01 | 0.34 |
| RUSBoosted tree | 85 | 66 | 73 | 80 | 77.6 | 0.79 | 0.15 | 0.66 |
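The statistical features behind this table (RMS, variance, skewness, kurtosis, smoothness; all also listed in the feature table) have standard moment-based definitions. A plain-Python sketch; note that "smoothness" here uses the common texture form 1 - 1/(1 + variance), which is an assumption on our part and need not match the paper's exact definition:

```python
import math

def statistical_features(x):
    """Moment-based statistical descriptors of a signal segment."""
    n = len(x)
    mean = sum(x) / n
    var = sum((v - mean) ** 2 for v in x) / n        # population variance
    std = math.sqrt(var)
    rms = math.sqrt(sum(v * v for v in x) / n)       # root mean square
    skew = sum((v - mean) ** 3 for v in x) / (n * std ** 3) if std else 0.0
    kurt = sum((v - mean) ** 4 for v in x) / (n * var ** 2) if var else 0.0
    smooth = 1.0 - 1.0 / (1.0 + var)                 # assumed "smoothness" form
    return {"RMS": rms, "Var": var, "Skewness": skew,
            "Kurtosis": kurt, "Smoothness": smooth}
```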
CHF detection performance based on entropy-based features by applying machine learning techniques.
| Classifier | TPR (%) | TNR (%) | PPV (%) | NPV (%) | TA (%) | AUC | 95% CI LB | 95% CI UB |
|---|---|---|---|---|---|---|---|---|
| **Decision tree** | | | | | | | | |
| Fine | 71 | 50 | 51 | 70 | 62.9 | 0.65 | 0.29 | 0.50 |
| Coarse | 90 | 36 | 70 | 70 | 69.8 | 0.69 | 0.10 | 0.36 |
| **SVM** | | | | | | | | |
| Linear | 93 | 30 | 72 | 68 | 69.0 | 0.71 | 0.07 | 0.30 |
| Quadratic | 83 | 57 | 68 | 76 | 73.3 | 0.74 | 0.17 | 0.57 |
| Cubic | 82 | 52 | 64 | 74 | 70.7 | 0.73 | 0.18 | 0.52 |
| Med. Gaussian | 94 | 30 | 76 | 69 | 69.8 | 0.75 | 0.06 | 0.30 |
| **KNN** | | | | | | | | |
| Fine | 75 | 57 | 58 | 74 | 68.1 | 0.66 | 0.25 | 0.57 |
| Medium | 85 | 50 | 67 | 73 | 71.6 | 0.69 | 0.15 | 0.50 |
| Cosine | 82 | 52 | 64 | 74 | 70.7 | 0.72 | 0.18 | 0.52 |
| **Ensemble** | | | | | | | | |
| Bagged tree | 82 | 57 | 66 | 76 | 72.4 | 0.78 | 0.18 | 0.57 |
| Subsp. disc. | 89 | 39 | 68 | 70 | 69.8 | 0.71 | 0.11 | 0.39 |
| RUSBoosted tree | 75 | 59 | 59 | 75 | 69.0 | 0.75 | 0.25 | 0.59 |
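The entropy-based features (e.g., the MApEn value in the feature table) quantify signal regularity. A compact sketch of approximate entropy ApEn(m, r), a standard regularity measure for RR-interval series; the paper's exact parameters and multiscale variants are not reproduced here, so this is illustrative only:

```python
import math

def approx_entropy(x, m=2, r=0.2):
    """Approximate entropy ApEn(m, r): lower values mean a more
    regular (predictable) series, higher values a more complex one."""
    def phi(m):
        templates = [x[i:i + m] for i in range(len(x) - m + 1)]
        counts = []
        for a in templates:
            # Fraction of templates within Chebyshev distance r of template a
            # (self-match included, as in the classical definition).
            c = sum(1 for b in templates
                    if max(abs(u - v) for u, v in zip(a, b)) <= r)
            counts.append(c / len(templates))
        return sum(math.log(c) for c in counts) / len(counts)
    return phi(m) - phi(m + 1)
```

A perfectly constant series has ApEn = 0; irregular series score higher.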
Figure 5. Heart failure rate detection performance using decision tree and KNN methods.
Figure 6. Heart failure rate detection performance using SVM and ensemble methods.
Figure 7. ROC to detect heart failure using Naïve Bayes, decision tree, and SVM with its kernels.
Figure 8. Model prediction to detect heart failure using the SVM linear classifier based on multimodal features.
Figure 9. Model prediction to detect heart failure using the SVM quadratic classifier based on multimodal features.
Feature-based significance levels to distinguish the CHF and NSR subjects.
| Feature | CHF (mean ± std) | NSR (mean ± std) |
|---|---|---|
| SDANN | 0.010 ± 0.015 | 0.018 ± 0.008 |
| SDNN | 0.066 ± 0.032 | 0.086 ± 0.026 |
| SDSD | 0.056 ± 0.045 | 0.028 ± 0.018 |
| RMSSD | 0.063 ± 0.050 | 0.035 ± 0.020 |
| TP | 347099 ± 316751 | 858649 ± 563951 |
| ULF | 80616 ± 80069 | 228361 ± 175578 |
| VLF | 178871 ± 166993 | 501651 ± 350518 |
| LF | 42353 ± 50725 | 68217 ± 53525 |
| HF | 45257 ± 58350 | 60419 ± 55206 |
| LF/HF | 1.442 ± 0.872 | 1.304 ± 0.389 |
| MSEKD | 1.370 ± 0.293 | 1.464 ± 0.179 |
| MApEn | 0.004 ± 0.006 | 0.0009 ± 0.003 |
| WEShannon | 6594 ± 962 | 6151 ± 1351 |
| WELogEn | −17460 ± 5702 | −14830 ± 5722 |
| WETh | 19999 ± 0.347 | 19999 ± 0.201 |
| WESure | −11085 ± 2692 | −9771 ± 2829 |
| WENorm | 12613 ± 2043 | 13594 ± 2070 |
| RMS | 0.660 ± 0.096 | 0.708 ± 0.098 |
| Var | 0.005 ± 0.005 | 0.008 ± 0.005 |
| Smoothness | 0.999 ± 1.08 × 10−5 | 0.999 ± 1.11 × 10−5 |
| Kurtosis | 40.2 ± 75.4 | 5.125 ± 7.386 |
| Skewness | 1.996 ± 2.672 | 0.264 ± 0.656 |
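Group differences summarized as mean ± std, as in the feature table above, can be screened with Welch's t statistic from summary statistics alone. The source does not state which significance test the authors used, so this is purely an illustrative sketch; the group sizes in the example call are hypothetical:

```python
import math

def welch_t(mean1, std1, n1, mean2, std2, n2):
    """Welch's t statistic from per-group summary statistics
    (mean, standard deviation, sample size)."""
    se = math.sqrt(std1 ** 2 / n1 + std2 ** 2 / n2)
    return (mean1 - mean2) / se

# SDNN row (CHF: 0.066 ± 0.032, NSR: 0.086 ± 0.026), with assumed n = 30 each.
t_sdnn = welch_t(0.066, 0.032, 30, 0.086, 0.026, 30)
```

The degrees of freedom for a p-value would come from the Welch-Satterthwaite formula, omitted here for brevity.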
Algorithm comparison of previous studies.
| Author | Title of article | Method | Performance |
|---|---|---|---|
| Li et al. | Combining convolutional neural network and distance distribution matrix for identification of congestive heart failure | CNN | TA = 81.9% |
| Isler and Kuntalp | Combining classical HRV indices with wavelet entropy measures improves to performance in diagnosing congestive heart failure | KNN | ACC = 81.92% |
| Narin et al. | Investigating the performance improvement of HRV indices in CHF using feature selection methods based on backward elimination and statistical significance | SVM | Sens = 79.33% |
| Isler and Kuntalp | Heart rate normalization in the analysis of heart rate variability in congestive heart failure | KNN | Sens = 82.72% |
| Pecchia et al. | Discrimination power of short-term heart rate variability measures for CHF assessment | CART | Sens = 89.75% |
| Elfadil and Ibrahim | Self-organising neural network approach for identification of patients with congestive heart failure | Spectral | ACC = 83.65% |
| Yang et al. | A heart failure diagnosis model based on SVM | SVM | TA = 74.42% |
| Chang et al. | Decision making model for early diagnosis of CHF using rough set and decision tree approaches | RS | Sens = 97.53% |
| Our method | Extraction of multimodal features to predict congestive heart failure (CHF) | DT | Sens = 82% |
| | | SVM linear | Sens = 96% |
| | | Ensemble subspace discriminant | Sens = 93% |