| Literature DB >> 35271097 |
Albatul Albattah1, Murad A Rassam1,2.
Abstract
As the Internet of Healthcare Things (IoHT) concept emerges today, Wireless Body Area Networks (WBANs) constitute one of the most prominent technologies for improving healthcare services. WBANs are made up of tiny devices that can effectively enhance patient quality of life by collecting and monitoring physiological data and sending it to healthcare givers, who assess the criticality of the patient and act accordingly. The collected data must be reliable, correct, and representative of the real context to facilitate right and prompt decisions by healthcare personnel. Anomaly detection has therefore become a field of interest for ensuring the reliability of collected data by detecting malicious data patterns that arise for various reasons, such as sensor faults, erroneous readings, and possible malicious activities. Various anomaly detection solutions have been proposed for WBANs. However, existing detection approaches, mostly based on statistical and machine learning techniques, become ineffective when dealing with big data streams and novel contextual anomalous patterns in WBANs. Therefore, this paper proposes a model that combines the correlations that exist among the different physiological data attributes with the ability of the hybrid Convolutional Long Short-Term Memory (ConvLSTM) technique to detect both simple point anomalies and contextual anomalies in the big data stream of a WBAN. Experimental evaluations revealed that the proposed model achieved an average F1-measure of 98% and an accuracy of 99% on the different subjects of the datasets, compared with 64% achieved by both CNN and LSTM separately.
Keywords: anomaly detection; convolutional neural networks; deep learning; long short-term memory; spatiotemporal correlation; wireless body area networks
Year: 2022 PMID: 35271097 PMCID: PMC8915085 DOI: 10.3390/s22051951
Source DB: PubMed Journal: Sensors (Basel) ISSN: 1424-8220 Impact factor: 3.576
Analysis of Existing Studies.
| Study | Point Anomaly | Contextual Anomaly (Temporal) | Contextual Anomaly (Spatial) | Correlation | Static Threshold | Dynamic Threshold | Faulty Measurements |
|---|---|---|---|---|---|---|---|
| [ ] | ✓ | ✓ | X | X | ✓ | X | ✓ |
| [ ] | ✓ | X | X | X | X | X | ✓ |
| [ ] | ✓ | ✓ | ✓ | ✓ | X | ✓ | ✓ |
| [ ] | ✓ | ✓ | X | ✓ | X | ✓ | ✓ |
| [ ] | ✓ | ✓ | X | ✓ | X | ✓ | ✓ |
| [ ] | ✓ | X | ✓ | ✓ | ✓ | X | ✓ |
| [ ] | ✓ | X | ✓ | X | ✓ | X | ✓ |
| [ ] | ✓ | X | X | X | X | X | ✓ |
| [ ] | ✓ | X | ✓ | X | ✓ | X | ✓ |
| [ ] | ✓ | X | X | ✓ | ✓ | ✓ | ✓ |
| [ ] | ✓ | ✓ | X | X | X | ✓ | ✓ |
| [ ] | ✓ | ✓ | ✓ | ✓ | X | ✓ | ✓ |
| [ ] | ✓ | ✓ | ✓ | ✓ | X | ✓ | ✓ |
| [ ] | ✓ | X | X | X | X | X | ✓ |
| [ ] | ✓ | X | ✓ | X | X | ✓ | ✓ |
| [ ] | ✓ | X | X | ✓ | X | ✓ | ✓ |
| [ ] | ✓ | X | ✓ | ✓ | ✓ | X | ✓ |
| [ ] | ✓ | X | ✓ | X | ✓ | X | ✓ |
| [ ] | ✓ | ✓ | ✓ | ✓ | X | ✓ | ✓ |
| [ ] | ✓ | X | ✓ | X | X | ✓ | ✓ |
Figure 1. Proposed model.
Sample of sensor readings for the Subject 1 dataset.
| HR | ABPSys | ABPDias | ABPMean | PULSE | RESP | SpO2 |
|---|---|---|---|---|---|---|
| 77.6 | 157.4 | 66.1 | 100.5 | 77.9 | 23 | 97.4 |
| 77.3 | 149.2 | 62.6 | 95 | 77.6 | 22.2 | 97 |
| 76.1 | 150.5 | 62.4 | 95.1 | 76.8 | 22.3 | 97 |
| 73 | 158.4 | 65.4 | 99.8 | 74.3 | 22.2 | 97.4 |
| 75.6 | 152.4 | 63.3 | 96.7 | 76.4 | 22.4 | 97.5 |
| 75 | 154.3 | 63.4 | 97.1 | 75.4 | 22.2 | 97.5 |
| 75.2 | 150.3 | 62.1 | 94.7 | 76.7 | 22.1 | 97.6 |
Figure 2. Sensor readings for the Subject 1 dataset.
Figure 3. LSTM cell architecture.
Definition and description of the variables used in the LSTM model [32].
| Variable | Definition and Description |
|---|---|
| W_f | The weight matrices of the forget gate |
| h_(t−1) | The output from the cell at time t−1 |
| x_t | The current input at time t |
| b_f | The bias in the forget gate |
| W_i | The weight matrices of the input gate |
| b_i | The bias in the input gate |
| W_o | The weight matrices of the output gate |
| b_o | The bias in the output gate |
| C_t | The cell state at time t |
| C̃_t | The data stored in the new (candidate) cell state |
| W_C | The weight matrices of the new cell state |
| b_C | The bias in the new cell state |
| h_t | The hidden states at sequential time steps |
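Using the variables in the table above, one forward step of an LSTM cell can be sketched in pure Python. For readability the sketch uses scalar weights rather than matrices; the all-zero parameters below are illustrative placeholders, not trained values from the paper.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def lstm_step(x_t, h_prev, c_prev, p):
    """One LSTM cell step with scalar weights.
    p maps each gate name to a (input weight, recurrent weight, bias) triple."""
    # Forget gate: f_t = sigmoid(W_f · [h_(t−1), x_t] + b_f)
    f_t = sigmoid(p["f"][0] * x_t + p["f"][1] * h_prev + p["f"][2])
    # Input gate: i_t = sigmoid(W_i · [h_(t−1), x_t] + b_i)
    i_t = sigmoid(p["i"][0] * x_t + p["i"][1] * h_prev + p["i"][2])
    # Candidate cell state: C̃_t = tanh(W_C · [h_(t−1), x_t] + b_C)
    c_tilde = math.tanh(p["c"][0] * x_t + p["c"][1] * h_prev + p["c"][2])
    # New cell state: C_t = f_t * C_(t−1) + i_t * C̃_t
    c_t = f_t * c_prev + i_t * c_tilde
    # Output gate and hidden state: h_t = o_t * tanh(C_t)
    o_t = sigmoid(p["o"][0] * x_t + p["o"][1] * h_prev + p["o"][2])
    h_t = o_t * math.tanh(c_t)
    return h_t, c_t

# With all-zero parameters every gate opens halfway (sigmoid(0) = 0.5),
# so starting from h = c = 0 the cell state and output stay at zero.
params = {g: (0.0, 0.0, 0.0) for g in ("f", "i", "c", "o")}
h, c = lstm_step(1.0, 0.0, 0.0, params)
```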
Figure 4. CNN-LSTM (ConvLSTM) architecture.
Definition and description of the variables used in the ConvLSTM model [36].
| Variable | Definition and Description |
|---|---|
| X_t | The input tensor at time t |
| H_t | The output tensor from the cell at time t |
| H_(t−1) | The output tensor from the cell at time t−1 |
| C_t | The cell state at time t |
| C̃_t | The data stored in the new (candidate) cell state |
| C_(t−1) | The cell state at time t−1 |
| f_t | Output of the forget gate; it controls which data are forgotten from the old cell state |
| i_t | Output of the input gate; it controls how much of the new data C̃_t is stored in the cell state |
| o_t | Output of the output gate; it controls which data are output |
| W_xf, W_xi, W_xc, W_xo | The convolution kernels applied to the input tensor X_t |
| W_hf, W_hi, W_hc, W_ho | The convolution kernels applied to the hidden state tensor H_(t−1) |
| W_cf, W_ci | The weights applied to the old cell state C_(t−1) |
| W_co | The weight applied to the new cell state C_t |
| b_f | The bias in the forget gate |
| b_i | The bias in the input gate |
| b_c | The bias for creating the data C̃_t |
| b_o | The bias in the output gate |
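In a ConvLSTM the matrix multiplications of the gates are replaced by convolutions. As a minimal sketch, the forget gate f_t = σ(W_xf * X_t + W_hf * H_(t−1) + W_cf ∘ C_(t−1) + b_f) can be written in pure Python with 1-D convolutions (the paper's model operates on tensors with 2-D kernels; the 1-D form and the zero-valued kernels below are simplifying assumptions for illustration).

```python
import math

def conv1d_same(x, k):
    """'Same'-padded 1-D convolution (cross-correlation) of list x with an odd-length kernel k."""
    pad = len(k) // 2
    xp = [0.0] * pad + list(x) + [0.0] * pad
    return [sum(k[j] * xp[i + j] for j in range(len(k))) for i in range(len(x))]

def convlstm_forget_gate(x_t, h_prev, c_prev, w_xf, w_hf, w_cf, b_f):
    """f_t = sigmoid(W_xf * X_t + W_hf * H_(t-1) + W_cf o C_(t-1) + b_f),
    where * is convolution and o is the element-wise (Hadamard) product."""
    conv_x = conv1d_same(x_t, w_xf)
    conv_h = conv1d_same(h_prev, w_hf)
    return [1.0 / (1.0 + math.exp(-(cx + ch + w_cf * c + b_f)))
            for cx, ch, c in zip(conv_x, conv_h, c_prev)]

# With zero kernels and bias, every element of the gate is sigmoid(0) = 0.5.
f = convlstm_forget_gate([1.0, 2.0, 3.0], [0.0, 0.0, 0.0], [0.0, 0.0, 0.0],
                         [0.0, 0.0, 0.0], [0.0, 0.0, 0.0], 0.0, 0.0)
```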
Physiological data normal and abnormal ranges [23].
| Physiological Parameter | Normal Range | Abnormal Range |
|---|---|---|
| Heart rate | 60–100 | <60 and >100 |
| Pulse rate | 60–100 | <60 and >100 |
| Respiration rate | 12–30 | <12 and >30 |
| SpO2 | 95–100 | <95 |
| ABPDias | 80–120 | <80 and >120 |
| ABPSys | 90–120 | <90 and >120 |
| ABPMean | 70–100 | <60 and >110 |
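The static-threshold point-anomaly check implied by the table above can be sketched as follows; the dictionary simply transcribes the normal ranges, and the column names reuse those of the Subject 1 sample.

```python
# Normal ranges for each physiological parameter, taken from the table above.
NORMAL_RANGES = {
    "HR": (60, 100),
    "PULSE": (60, 100),
    "RESP": (12, 30),
    "SpO2": (95, 100),
    "ABPDias": (80, 120),
    "ABPSys": (90, 120),
    "ABPMean": (70, 100),
}

def is_point_anomaly(parameter, value):
    """Flag a reading as a point anomaly if it falls outside the normal range."""
    low, high = NORMAL_RANGES[parameter]
    return not (low <= value <= high)

# An HR of 77.6 is normal; a respiration rate of 35 breaths/min is anomalous.
flags = [is_point_anomaly("HR", 77.6), is_point_anomaly("RESP", 35)]
```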
Figure 5. Process of calculating the point anomaly.
Figure 6. Concept of spatial and temporal correlation.
Figure 7. Process of calculating the contextual anomaly.
Figure 8. The value of the dynamic threshold and the respective point anomalies for various physiological sensors: (a) SpO2 threshold. (b) SpO2 point anomaly. (c) RESP threshold. (d) RESP point anomaly. (e) Pulse threshold. (f) Pulse point anomaly. (g) ABPSys threshold. (h) ABPSys point anomaly. (i) ABPMean threshold. (j) ABPMean point anomaly. (k) ABPDias threshold. (l) ABPDias point anomaly.
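This excerpt does not reproduce the paper's exact dynamic-threshold formula, but a common sliding-window scheme (mean ± k·standard deviation over the preceding window; both `window` and `k` are assumed illustrative parameters) conveys the idea behind the per-sensor thresholds in Figure 8:

```python
import statistics

def dynamic_threshold_anomalies(series, window=5, k=3.0):
    """Flag points lying more than k standard deviations away from the mean
    of the preceding `window` readings (window and k are illustrative)."""
    anomalies = []
    for i in range(window, len(series)):
        history = series[i - window:i]
        mean = statistics.mean(history)
        std = statistics.pstdev(history)
        if abs(series[i] - mean) > k * std:
            anomalies.append(i)
    return anomalies

# A sudden SpO2 drop stands out against a stable recent window.
spo2 = [97.4, 97.0, 97.0, 97.4, 97.5, 97.5, 80.0, 97.6]
idx = dynamic_threshold_anomalies(spo2)
```

Unlike the static ranges, the band here adapts to each patient's recent readings, which is what allows contextual deviations to be caught.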
Figure 9. Loss rate for the various physiological sensors. (a) HR sensor. (b) ABPSys sensor. (c) ABPMean sensor. (d) Pulse sensor. (e) RESP sensor. (f) SpO2 sensor. (g) ABPDias sensor.
Figure 10. Correlations between physiological data.
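The pairwise correlations of Figure 10 can be computed with the standard Pearson coefficient; the sketch below applies it to the ABPSys and ABPMean columns of the Subject 1 sample (an illustrative pairing — the evaluation tables correlate ABPDias with HR and with ABPSys).

```python
import math

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# ABPSys and ABPMean readings from the Subject 1 sample move together,
# so their correlation is strongly positive.
abp_sys = [157.4, 149.2, 150.5, 158.4, 152.4, 154.3, 150.3]
abp_mean = [100.5, 95.0, 95.1, 99.8, 96.7, 97.1, 94.7]
r = pearson(abp_sys, abp_mean)
```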
Performance evaluation (different datasets, ABPDias correlated with HR sensors).
| Subject No. | Size | Accuracy (%) | Loss (%) | Recall (%) | Precision (%) | F1-Score (%) | Time (s) |
|---|---|---|---|---|---|---|---|
| Subject 1 | 1.02 MB | 99.30% | 0.0018% | 100% | 98.25% | 99.12% | 86 s |
| Subject 2 | 467 KB | 99.59% | 0.0063% | 98.89% | 99.58% | 99.46% | 48 s |
| Subject 3 | 885 KB | 99.89% | 0.0023% | 100% | 100% | 99.88% | 63 s |
| Subject 4 | 1.03 MB | 99.94% | 0.0008% | 100% | 99.93% | 99.96% | 96 s |
Performance evaluation (different datasets, ABPDias correlated with ABPSys sensors).
| Subject No. | Accuracy (%) | Loss (%) | Recall (%) | Precision (%) | F1-Score (%) | Time (s) |
|---|---|---|---|---|---|---|
| Subject 1 | 99.90% | 0.0066% | 100% | 99.94% | 99.92% | 47 s |
| Subject 2 | 99.97% | 0.0053% | 100% | 100% | 99.98% | 87 s |
| Subject 3 | 99.96% | 0.0017% | 100% | 99.42% | 99.95% | 89 s |
| Subject 4 | 99.94% | 0.0036% | 100% | 100% | 99.77% | 145 s |
Performance evaluation (different datasets with full correlation).
| Subject No. | Accuracy (%) | Loss (%) | Recall (%) | Precision (%) | F1-Score (%) | Time (s) |
|---|---|---|---|---|---|---|
| Subject 1 | 97.59% | 0.1131% | 99.40% | 95.56% | 97.59% | 60 s |
| Subject 2 | 99.91% | 0.0112% | 99.93% | 99.87% | 99.90% | 60 s |
| Subject 3 | 99.97% | 0.0022% | 100% | 99.86% | 99.96% | 60 s |
| Subject 4 | 97.69% | 0.028% | 100% | 94.05% | 96.93% | 60 s |
Deep learning models’ parameters.
| Parameters | LSTM Value | CNN Value |
|---|---|---|
| Language | Python | Python |
| Libraries | Pandas, NumPy, scikit-learn, Matplotlib, and Keras | Pandas, NumPy, scikit-learn, Matplotlib, and Keras |
| Train set | 70% | 70% |
| Test set | 30% | 30% |
| Input Layer | 4 | 4 |
| Activation Functions | Rectified Linear Unit (ReLU) and sigmoid | Rectified Linear Unit (ReLU) |
| Dense Layer | 2 | 2 |
| Dropout | 0.20 | 0.20 |
| Optimizer | Adam | Adam |
| Number of Epochs | 30 | 30 |
| Batch size | 72 | 72 |
Comparison with deep learning models.
| Model | Subject | Accuracy (%) | Loss (%) | Recall (%) | Precision (%) | F1-Score (%) | Time (s) |
|---|---|---|---|---|---|---|---|
| CNN | Subject 1 | 46.14% | 0.6602% | 100% | 46.14% | 63.14% | 21 s |
| | Subject 2 | 68.30% | 0.570% | 100% | 41% | 58% | 73 s |
| | Subject 3 | 70.80% | 0.5678% | 99.93% | 44.56% | 61.63% | 62 s |
| | Subject 4 | 36.53% | 0.5896% | 100% | 36.53% | 53.51% | 40 s |
| LSTM | Subject 1 | 76.25% | 0.3649% | 99.74% | 46.07% | 63.03% | 90 s |
| | Subject 2 | 95.89% | 0.1252% | 98.63% | 41.12% | 58.23% | 317 s |
| | Subject 3 | 94.45% | 0.118% | 100% | 44.57% | 61.66% | 506 s |
| | Subject 4 | 86.73% | 0.279% | 99.14% | 36.36% | 53.20% | 480 s |
| CNN-LSTM | Subject 1 | 97.59% | 0.1131% | 99.40% | 95.56% | 97.59% | 60 s |
| | Subject 2 | 99.97% | 0.0022% | 100% | 99.86% | 99.96% | 60 s |
| | Subject 3 | 99.91% | 0.0012% | 99.93% | 99.87% | 99.90% | 60 s |
| | Subject 4 | 97.69% | 0.028% | 100% | 94.05% | 96.93% | 60 s |
Comparison with machine learning models.
| Model | Subject | Accuracy (%) | Recall (%) | Precision (%) | F1-Score (%) |
|---|---|---|---|---|---|
| LR | Subject 1 | 95.50% | 94% | 97% | 96% |
| | Subject 2 | 96.17% | 99% | 94% | 97% |
| | Subject 3 | 93.33% | 95% | 93% | 94% |
| | Subject 4 | 88.78% | 87% | 96% | 92% |
| DT | Subject 1 | 99.92% | 63% | 71% | 67% |
| | Subject 2 | 100% | 100% | 83% | 91% |
| | Subject 3 | 99.88% | 89% | 99% | 94% |
| | Subject 4 | 100% | 92% | 76% | 83% |
| RF | Subject 1 | 99.92% | 94% | 97% | 96% |
| | Subject 2 | 100% | 99% | 94% | 97% |
| | Subject 3 | 99.91% | 93% | 95% | 94% |
| | Subject 4 | 100% | 66% | 94% | 78% |
| SVM | Subject 1 | 61.82% | 71% | 63% | 67% |
| | Subject 2 | 87.69% | 100% | 83% | 91% |
| | Subject 3 | 93.04% | 99% | 89% | 94% |
| | Subject 4 | 65.70% | 66% | 94% | 78% |
| Proposed Model | Subject 1 | 97.59% | 99.40% | 95.56% | 97.59% |
| | Subject 2 | 99.91% | 99.93% | 99.87% | 99.90% |
| | Subject 3 | 99.97% | 100% | 99.86% | 99.96% |
| | Subject 4 | 97.69% | 100% | 94.05% | 96.93% |
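The accuracy, recall, precision, and F1-score columns in the tables above follow from the standard confusion-matrix definitions; a minimal helper (the example counts are made up for illustration):

```python
def classification_metrics(tp, fp, fn, tn):
    """Accuracy, recall, precision, and F1-score from confusion-matrix counts."""
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    recall = tp / (tp + fn)          # fraction of true anomalies detected
    precision = tp / (tp + fp)       # fraction of alarms that are real
    f1 = 2 * precision * recall / (precision + recall)
    return accuracy, recall, precision, f1

# Hypothetical counts: 95 true positives, 5 false positives,
# 0 false negatives, 100 true negatives.
acc, rec, prec, f1 = classification_metrics(95, 5, 0, 100)
```

F1 is the harmonic mean of precision and recall, which is why a model with perfect recall but poor precision (as CNN and LSTM show above) still scores only around 60%.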