Atia Javaid, Nadeem Javaid, Zahid Wadud, Tanzila Saba, Osama E. Sheta, Muhammad Qaiser Saleem, Mohammad Eid Alzahrani.
Abstract
Decision fusion combines classification results to improve classification accuracy while reducing the energy consumption and bandwidth demand of data transmission. The decentralized classification fusion problem motivated the use of a belief function-based decision fusion approach in Wireless Sensor Networks (WSNs). To improve the belief function fusion approach, we propose four classification techniques, namely Enhanced K-Nearest Neighbor (EKNN), Enhanced Extreme Learning Machine (EELM), Enhanced Support Vector Machine (ESVM), and Enhanced Recurrent Extreme Learning Machine (ERELM). In addition, WSNs are prone to errors and faults because of software and hardware failures and their deployment in diverse environments. These challenges demand efficient fault detection methods that detect faults in a WSN in a timely manner. We induce four types of faults: offset fault, gain fault, stuck-at fault, and out-of-bounds fault, and apply the enhanced classification methods to detect these sensor failures. Experimental results show that ERELM gives the best improvement of the belief function fusion approach; the other three proposed techniques, ESVM, EELM, and EKNN, give the second, third, and fourth best results, respectively. The proposed enhanced classifiers are used for fault detection and are evaluated with three performance metrics, i.e., Detection Accuracy (DA), True Positive Rate (TPR), and Error Rate (ER). Simulations show that the proposed methods outperform existing techniques for both the belief function and fault detection in WSNs.
Keywords: ELM; KNN; RELM; SVM; Wireless Sensor Networks; belief function; machine learning classifiers
Year: 2019 PMID: 30884880 PMCID: PMC6471498 DOI: 10.3390/s19061334
Source DB: PubMed Journal: Sensors (Basel) ISSN: 1424-8220 Impact factor: 3.576
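The four induced fault types named in the abstract follow common definitions in the WSN fault-detection literature: an offset fault adds a constant bias, a gain fault scales the true reading, a stuck-at fault freezes the sensor at one value, and an out-of-bounds fault reports a value outside the physical range. A minimal NumPy sketch of such fault injection follows; the magnitudes (`offset=5.0`, `gain=2.0`, etc.) are illustrative assumptions, not the paper's parameters.

```python
import numpy as np

rng = np.random.default_rng(0)
readings = rng.normal(25.0, 1.0, size=100)  # e.g., clean temperature readings

def inject_fault(x, kind, idx, offset=5.0, gain=2.0, stuck=0.0, bound=100.0):
    """Return a copy of x with the given fault injected at positions idx."""
    y = x.copy()
    if kind == "offset":            # constant bias added to the true value
        y[idx] = x[idx] + offset
    elif kind == "gain":            # true value scaled by a constant factor
        y[idx] = gain * x[idx]
    elif kind == "stuck":           # sensor reports one frozen value
        y[idx] = stuck
    elif kind == "out_of_bounds":   # reading outside the physical range
        y[idx] = bound
    else:
        raise ValueError(f"unknown fault type: {kind}")
    return y

# Inject a gain fault into every tenth reading (10% fault rate)
faulty = inject_fault(readings, "gain", np.arange(0, 100, 10))
```

Varying the fraction of indices passed as `idx` reproduces the 10%-50% induced-fault rates evaluated in the result tables below.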
Related work.
| Classifiers | Goals | Limitations | Techniques and Methods |
|---|---|---|---|
| KNN, NB, neural network, decision tree | Classification of high-resolution forest remote-sensing images | No association between data mining and conflict management | Generic framework and automatic method for weighting factors |
| Data distribution-based intuitionistic fuzzy set construction algorithm | Multi-attribute decision fusion model | Negative and positive non-ideal solutions are not elaborated | Intuitionistic fuzzy set |
| Multi-input and multi-output decision fusion classifier | Required amount of energy for a WSN | Performance decreases with high SNR | On-off keying scenario |
| Fourier-based, stream-based classifiers | Handle energy dissipation | Did not exploit parallelism | Deep tree-based structures |
| CNN, SVM | Golf swing classification method | No discussion of sensor relevancy and redundancy | Architecture of a vanilla convolutional neural network |
| Gaussian NB | Processing of sensor datasets with various machine learning algorithms | Feature selection is not used | Supervised machine learning |
| PCA, ANN, SVM | Classification of ECG signals | No feature selection | Adaptive filter |
| ELM | Enhanced ELM | Applications of the proposed methods are not discussed | Regularized ELM |
| MLP | To detect state of charge | There is no regression method used | Structured MLP architecture |
| KNN | To predict data accurately | Temporal correlation is lacking | Linear regression model to describe the spatial correlation |
| KNN | To develop a reliable spectrum sensing scheme | No feature selection or extraction | Majority voting |
| SVM | Fault detection | Regression is not elaborated | Decision tree classification protocol |
| SVM | Fault detection | Overfitting problem | K-fold cross-validation technique |
| SVM, NB, and gradient boosting decision tree | Fault detection | No DA reported | Non-linear mapping algorithm |
Figure 1System model of belief function theory.
Figure 2System model of fault detection.
ER of the proposed classifiers for each fault type.
| Fault Types | EKNN | EELM | ESVM | ERELM |
|---|---|---|---|---|
| Offset Fault | 4.52% | 4.79% | 4.81% | 4.86% |
| Gain Fault | 4.90% | 5.23% | 5.96% | 6.13% |
| Stuck-at Fault | 4.77% | 5.13% | 6.9% | 5.65% |
| Out of Bounds | 4.81% | 4.91% | 5.82% | 6.77% |
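The ER values above, and the DA and TPR values reported below, are standard confusion-matrix metrics. A minimal sketch of how they can be computed, assuming binary labels where 1 marks a faulty reading (this is an illustration, not the paper's evaluation code):

```python
import numpy as np

def evaluate(y_true, y_pred):
    """Compute DA, TPR, and ER for binary fault labels (1 = faulty)."""
    y_true = np.asarray(y_true)
    y_pred = np.asarray(y_pred)
    tp = np.sum((y_true == 1) & (y_pred == 1))   # faults correctly detected
    fn = np.sum((y_true == 1) & (y_pred == 0))   # faults missed
    da = np.mean(y_true == y_pred)               # Detection Accuracy
    tpr = tp / (tp + fn) if (tp + fn) > 0 else 0.0  # True Positive Rate
    er = 1.0 - da                                # Error Rate
    return da, tpr, er

da, tpr, er = evaluate([1, 1, 0, 0], [1, 0, 0, 0])
```

Note that with these definitions ER is simply the complement of DA, which matches the rough inverse relationship between the ER and DA tables.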
Figure 3TPR of ERELM, ESVM, EELM, and EKNN.
Figure 4Normal and abnormal sensor data.
Figure 5Normal and abnormal sensor data.
Figure 6Training data.
Figure 7Testing data.
Figure 8KNN.
Figure 9EKNN.
Figure 10ELM.
Figure 11EELM.
Figure 12SVM.
Figure 13ESVM.
Figure 14RELM.
Figure 15ERELM.
Figure 16ERELM.
Figure 17ESVM.
Figure 18EELM.
Figure 19EKNN.
DA with 10% induced faults.
| Fault Types | ERELM | ESVM | EELM | EKNN |
|---|---|---|---|---|
| Offset Fault | 97.9% | 93.3% | 92.3% | 92.2% |
| Gain Fault | 96.5% | 90.0% | 91.5% | 80.0% |
| Stuck-at Fault | 97.3% | 91.6% | 95.7% | 90.5% |
| Out of Bounds | 98.8% | 98.4% | 91.6% | 90.4% |
DA with 20% induced faults.
| Fault Types | ERELM | ESVM | EELM | EKNN |
|---|---|---|---|---|
| Offset Fault | 98.8% | 94.1% | 81.2% | 82.0% |
| Gain Fault | 98.1% | 93.5% | 84.5% | 81.2% |
| Stuck-at Fault | 98.2% | 93.0% | 83.5% | 81.0% |
| Out of Bounds | 98.0% | 93.2% | 83.6% | 81.4% |
DA with 30% induced faults.
| Fault Types | ERELM | ESVM | EELM | EKNN |
|---|---|---|---|---|
| Offset Fault | 97.4% | 97.9% | 83.0% | 80.0% |
| Gain Fault | 95.0% | 95.0% | 82.8% | 73.0% |
| Stuck-at Fault | 97.1% | 98.9% | 83.3% | 80.2% |
| Out of Bounds | 97.5% | 97.9% | 82.9% | 80.1% |
DA with 40% induced faults.
| Fault Types | ERELM | ESVM | EELM | EKNN |
|---|---|---|---|---|
| Offset Fault | 97.2% | 93.2% | 81.1% | 62.8% |
| Gain Fault | 97.1% | 93.2% | 81.0% | 62.4% |
| Stuck-at Fault | 95.8% | 90.5% | 80.4% | 61.0% |
| Out of Bounds | 97.0% | 94.8% | 70.0% | 63.2% |
DA with 50% induced faults.
| Fault Types | ERELM | ESVM | EELM | EKNN |
|---|---|---|---|---|
| Offset Fault | 97.8% | 84.8% | 48.3% | 58.8% |
| Gain Fault | 97.0% | 85.2% | 70.1% | 58.8% |
| Stuck-at Fault | 97.3% | 84.5% | 48.3% | 58.9% |
| Out of Bounds | 97.4% | 84.4% | 48.9% | 58.9% |
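For reference, the KNN baseline that the enhanced classifiers build on labels each test reading by a majority vote among its k nearest training readings. A self-contained NumPy sketch of plain KNN classification (a generic illustration, not the paper's EKNN enhancement):

```python
import numpy as np

def knn_predict(X_train, y_train, X_test, k=3):
    """Classify each row of X_test by majority vote of its k nearest neighbors."""
    preds = []
    for x in X_test:
        d = np.linalg.norm(X_train - x, axis=1)       # Euclidean distances
        nearest = y_train[np.argsort(d)[:k]]          # labels of k closest points
        preds.append(np.bincount(nearest).argmax())   # majority vote
    return np.array(preds)

# Toy example: two well-separated clusters of 1-D sensor readings
X_train = np.array([[0.0], [0.1], [5.0], [5.1]])
y_train = np.array([0, 0, 1, 1])                      # 0 = normal, 1 = faulty
preds = knn_predict(X_train, y_train, np.array([[0.05], [5.05]]), k=3)
```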