Ijaz Ahmad1,2,3, Xin Wang1,2,3, Mingxing Zhu2,4, Cheng Wang1,2,3, Yao Pi5, Javed Ali Khan6, Siyab Khan7, Oluwarotimi Williams Samuel1,3, Shixiong Chen1,3, Guanglin Li1,3.
Abstract
Epileptic seizure is one of the most common chronic neurological diseases and can instantaneously disrupt the lifestyle of affected individuals. Toward developing novel and efficient technology for epileptic seizure management, recent diagnostic approaches have focused on machine/deep learning (ML/DL) model-based electroencephalogram (EEG) methods. Importantly, EEG's noninvasiveness and its ability to capture repeated patterns of epilepsy-related electrophysiological information have motivated the development of varied ML/DL algorithms for epileptic seizure diagnosis in recent years. However, EEG's low amplitude and nonstationary characteristics make it difficult for existing ML/DL models to achieve consistent and satisfactory diagnostic outcomes, especially in clinical settings, where environmental factors can hardly be avoided. Although several recent works have explored EEG-based ML/DL methods and statistical features for seizure diagnosis, the advantages and limitations of these works remain unclear, which might hinder progress in the field of epileptic seizure diagnosis and obscure the criteria for selecting ML/DL models and statistical feature extraction methods for EEG-based epileptic seizure diagnosis. Therefore, this paper attempts to bridge this research gap by conducting an extensive systematic review of the recent developments of EEG-based ML/DL technologies for epileptic seizure diagnosis. In the review, current developments in seizure diagnosis, various statistical feature extraction methods, and ML/DL models, along with their performances, limitations, and core challenges as applied to EEG-based epileptic seizure diagnosis, were meticulously reviewed and compared. In addition, criteria for selecting appropriate and efficient feature extraction techniques and ML/DL models for epileptic seizure diagnosis were also discussed.
Findings from this study will aid researchers in deciding the most efficient ML/DL models with optimal feature extraction methods to improve the performance of EEG-based epileptic seizure detection.
Year: 2022 PMID: 35755757 PMCID: PMC9232335 DOI: 10.1155/2022/6486570
Source DB: PubMed Journal: Comput Intell Neurosci
Figure 1. The illustration of seizure types and their subtypes.
Figure 2. The proportions of accepted papers for the review across different citation databases.
Figure 3. A framework of how the systematic review was conducted.
Figure 4. A block diagram of epileptic seizure detection using EEG signals and machine/deep learning techniques.
A full description of publicly available EEG datasets for epileptic seizure detection.
| Dataset | Recording | No. of seizures | Sampling frequency | Duration | No. of patients |
|---|---|---|---|---|---|
| CHB-MIT | Scalp EEG | 163 | 256 Hz | 844 h | 22 |
| Bonn | Surface and iEEG | NA | 173.61 Hz | 39 min | 10 |
| Freiburg | iEEG | 87 | 256 Hz | 708 h | 21 |
| Kaggle | iEEG | 48 | 400 Hz/5 kHz | 627 h | 5 dogs, 2 patients |
| Zenodo | Scalp EEG | 460 | 256 Hz | 74 min | 79 neonates |
| Bern-Barcelona | iEEG | 3750 | 512 Hz | 83 min | 5 |
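Most of the studies surveyed here first segment the continuous recordings from these datasets into fixed-length windows before extracting features. A minimal sketch of that preprocessing step, assuming a 256 Hz sampling rate (as in CHB-MIT and Freiburg); the window length and overlap are illustrative choices, not taken from any particular study:

```python
import numpy as np

def segment_eeg(signal, fs=256, win_sec=2.0, overlap=0.5):
    """Split a 1-D EEG channel into fixed-length, overlapping windows.

    fs      : sampling frequency in Hz (256 Hz matches CHB-MIT/Freiburg)
    win_sec : window length in seconds (2 s is a common, illustrative choice)
    overlap : fraction of overlap between consecutive windows
    """
    win = int(fs * win_sec)
    step = int(win * (1.0 - overlap))
    n_windows = 1 + (len(signal) - win) // step
    return np.stack([signal[i * step : i * step + win]
                     for i in range(n_windows)])

# 10 s of synthetic single-channel EEG at 256 Hz
x = np.random.randn(10 * 256)
windows = segment_eeg(x)
print(windows.shape)  # (9, 512)
```

Each row of `windows` then becomes one sample for feature extraction or direct input to an ML/DL model.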
Figure 5. Various datasets used in different studies for epileptic seizure detection with ML/DL techniques.
Figure 6. Different commonly adopted feature extraction methods for EEG signal characterization.
Efficient polynomial-based methods for feature selection in EEG-based epileptic seizure detection.
| Feature selection methods | Description |
|---|---|
| — | Implemented to compress highly correlated features into a lower-dimensional subspace; used in various pattern-recognition applications, including EEG signal classification |
| — | Used to reduce the dimensionality of highly complex nonlinear data to a lower-dimensional subspace; extensively utilized to present large amounts of high-dimensional biological data |
| — | Used to handle the problem of nonlinear dimensionality reduction; useful for compressing electroencephalogram (EEG) data |
| — | Processes multivariate data representing vast database samples, since the EEG signal is composed of various random signals |
| GDA | One of the most frequently utilized methods for extracting nonlinear features from the EEG signal. GDA is highly effective because the generalized discriminants are computed by mapping the training data into a high-dimensional space using a kernel function |
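The first entry in the table describes compressing highly correlated features into a lower-dimensional subspace. A minimal PCA-style sketch of that linear projection in NumPy; the feature matrix and component count are illustrative placeholders, not data from any cited study:

```python
import numpy as np

def pca_reduce(features, n_components=3):
    """Project a (samples x features) matrix onto its top principal
    components via SVD -- the linear dimensionality reduction the
    table's first entry describes."""
    centered = features - features.mean(axis=0)
    # Rows of vt are principal directions, ordered by explained variance.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return centered @ vt[:n_components].T

rng = np.random.default_rng(0)
feats = rng.standard_normal((100, 20))   # e.g. 20 statistical EEG features
reduced = pca_reduce(feats, n_components=3)
print(reduced.shape)  # (100, 3)
```

The kernel-based methods in the table (e.g., GDA) follow the same compress-then-classify idea but map the data through a nonlinear kernel before computing the discriminant directions.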
A review of recent research that applied the 2D-CNN model for seizure prediction with their corresponding accuracies and limitations.
| Authors | Machine-learning approaches | Feature selection methods | Dataset | Performance metrics | Limitations | Accuracy (%) |
|---|---|---|---|---|---|---|
| Bizopoulos et al. | Softmax, standard networks | 2D/3D phase-space representation of intrinsic mode functions | BONN | Overall accuracy | Low detection accuracy | 85.30 |
| Antoniades et al. | LR, 2D-CNN | Time domain | BONN | Overall accuracy | — | 87.50 |
| Park et al. | Softmax, 2D-CNN | 2D/3D phase-space representation of intrinsic mode functions | CHB-MIT, SNUH-HYU data | Spec, sens, time difference | Low sens, spec | 90.58 |
| Sui et al. | Softmax, 2D-CNN | FT | Kaggle | Overall accuracy | High time complexity | 91.18 |
| Turk and Ozerdem | Softmax, 2D-CNN | Time-frequency domain, CWT | Freiburg | Spec, sens, acc, F-measure | Low spec for multi-class | 93.60 |
| Faust et al. | Softmax, 2D-CNN | Wavelet transform (DWT) | Bern-Barcelona data | Energy, frequency | Low accuracy | 94.50 |
| Tian et al. | MV-TSK-FS, 2D-CNN | FFT, WPD | CHB-MIT | Overall accuracy | — | 95.33 |
| Lecun et al. | Res-CNN | Conventional feature extraction method | BONN | Overall acc | — | 95.70 |
| LeCun and Triesch | Softmax, 2D-CNN | Features extracted by CNN | Bern-Barcelona | Overall accuracy | High detection time | 95.90 |
| San-Segundo et al. | Softmax, 2D-CNN | DWT | CHB-MIT | Class acc | High training time | 96.10 |
| Akut | Sigmoid, 2D-CNN | FFT, WPD | Kaggle | Spec, sens | High training time | 96.15 |
Figure 7. Comparison of accuracies (%) versus authors introducing 2D-CNN models for seizure detection.
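Several of the 2D-CNN studies above (e.g., those using FT, CWT, or DWT features) first map each 1-D EEG segment to a 2-D time-frequency image that the network consumes. A minimal spectrogram sketch using NumPy's FFT; the window length and hop size are illustrative, not taken from any cited study:

```python
import numpy as np

def spectrogram(signal, win=128, hop=64):
    """Short-time Fourier magnitude: one row per frequency bin,
    one column per time frame -- the 2-D 'image' a 2D-CNN takes."""
    frames = [signal[i:i + win] * np.hanning(win)
              for i in range(0, len(signal) - win + 1, hop)]
    mags = np.abs(np.fft.rfft(np.asarray(frames), axis=1))
    return mags.T  # shape: (win // 2 + 1, n_frames)

# 4 s test tone at 10 Hz, sampled at 256 Hz
x = np.sin(2 * np.pi * 10 * np.arange(1024) / 256.0)
img = spectrogram(x)
print(img.shape)  # (65, 15)
```

With a 256 Hz sampling rate and a 128-sample window, each frequency bin spans 2 Hz, so the 10 Hz tone concentrates its energy in bin 5 of every column.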
A review of recent research that applied the LSTM-RNN model for seizure prediction with their corresponding accuracies.
| Authors | Machine learning approaches | Feature selection methods | Dataset | Performance metrics | Limitations | Accuracy (%) |
|---|---|---|---|---|---|---|
| Yao et al. | Softmax, LSTM | Independent RNN | CHB-MIT | Sens, spec, prec | Low sens, prec | 88.80 |
| Chen et al. | Softmax, LSTM | Wavelet transform (DWT) | Zenodo | Prec, spec, class acc | Low prec | 90.00 |
| Chen et al. | Softmax, LSTM | Wavelet transform (DWT) | BONN | Overall accuracy | High detection time | 91.82 |
| Hussein et al. | Softmax, LSTM | Time domain, time-frequency domain | Freiburg | Sens, spec | — | 92.75 |
| Jaafar and Mohammad | Softmax, LSTM | Independent RNN | Freiburg data | Overall accuracy | High training time | 93.75 |
| Talathi and Vartak | RNN, GRU | RNNs | BONN | Class accuracy | High time complexity | 94.00 |
| Ahmedt-Aristizabal et al. | Softmax, LSTM | Computer-based analytical approaches | Mater Advanced Epilepsy Unit | Overall accuracy | — | 95.00 |
| Yao et al. | Softmax, LSTM | Independent RNN | Bern-Barcelona | Sens, spec, prec | High time complexity | 96.00 |
| Hussein | Softmax, LSTM | Fully connected (FC) RNN | Zenodo | Sens, spec | High training time | 96.00 |
Figure 8. Comparison of accuracies (%) versus authors introducing LSTM-RNN models for seizure detection.
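The LSTM units common to the studies above gate information through input, forget, and output gates applied to a persistent cell state. A single forward step of one LSTM cell written out in NumPy; the weights here are random placeholders, not a trained model:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h, c, W, b):
    """One LSTM time step. W maps [x; h] to the stacked
    input/forget/output/candidate pre-activations; b is the bias."""
    z = W @ np.concatenate([x, h]) + b
    n = h.size
    i, f, o = sigmoid(z[:n]), sigmoid(z[n:2*n]), sigmoid(z[2*n:3*n])
    g = np.tanh(z[3*n:])
    c_new = f * c + i * g          # gated update of the cell state
    h_new = o * np.tanh(c_new)     # gated exposure of the hidden state
    return h_new, c_new

rng = np.random.default_rng(1)
n_in, n_hid = 4, 8
W = rng.standard_normal((4 * n_hid, n_in + n_hid)) * 0.1
b = np.zeros(4 * n_hid)
h = c = np.zeros(n_hid)
for t in range(5):                 # run a short synthetic sequence
    h, c = lstm_step(rng.standard_normal(n_in), h, c, W, b)
print(h.shape)  # (8,)
```

In the surveyed systems, the final hidden state (or the sequence of hidden states) is passed to a Softmax or sigmoid layer for the seizure/non-seizure decision.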
A review of recent research that applied CNN-RNN models for seizure prediction with their corresponding accuracies.
| Authors | ML/DL approaches | Feature selection methods | Dataset | Performance metrics | Limitations | Accuracy (%) |
|---|---|---|---|---|---|---|
| Fang et al. | ST-GRU ConvNets | Time domain | CHB-MIT | Latency | Low accuracy | 77.30 |
| Ravi Prakash et al. | Sigmoid, 1D-CNN-LSTM | Time-domain features | Freiburg | Sens, spec | Low sens, spec | 83.05 |
| Ravi Prakash et al. | CNN-RNN | 2D/3D phase-space representation of intrinsic mode functions | MAEU data | Overall accuracy | — | 90.22 |
| Ahmedt-Aristizabal et al. | Sigmoid, 2D-CNN-LSTM | Time-domain features | TUH data | Overall accuracy | High detection time | 92.50 |
| Roy et al. | Sigmoid, 2D-CNN-LSTM | Time-frequency-domain features | Kaggle | Sens, spec | High training time | 93.00 |
| Liang et al. | Softmax, 1D-CNN-GRU | 2D/3D phase-space representation of intrinsic mode functions | Bern-Barcelona | Overall accuracy | High time complexity | 94.16 |
| Choi et al. | 1D-CNN-biGRU | Frequency domain | CHB-MIT | Sensitivity | High training time | 94.40 |
Figure 9. Comparison of accuracies (%) versus authors introducing CNN-RNN models for seizure detection.
A review of recent research that applied AE in seizure detection with their corresponding accuracies.
| Authors | Machine learning approaches | Feature selection methods | Dataset | Performance metrics | Limitations | Accuracy (%) |
|---|---|---|---|---|---|---|
| Gasparini et al. | Softmax, SAE | Time-frequency, CWT | Reggio Calabria data | Sens, spec | Low sens, spec, acc | 86.50 |
| Singh and Malhotra | Softmax, SAE | AE and SE | BONN | Sens, spec, acc | Low sens, spec, acc | 88.80 |
| Yuan et al. | Softmax, SSpDAE | SAE, six features | Zenodo | ROC, PR, F-measure | — | 90.64 |
| Yuan et al. | Softmax, SpDAE | Time-frequency | CHB-MIT | F1-measure, confusion matrix | Low detection acc | 90.82 |
| Hosseini et al. | Softmax, SpAE | PCA | Zenodo | Prec, sens, FPR, FNR | High FNR | 91.00 |
| Karim et al. | Softmax, SAE | DWT | BONN | Confusion matrix | Low prec | 91.00 |
| Yuan et al. | Softmax, SAE | AE and SE | CHB-MIT | Prec, sens, F-measure | — | 92.61 |
| Sharathappriyaa et al. | Softmax, AE | HWPT, FD | Freiburg | Sens, spec | High time complexity | 92.67 |
| Karim et al. | Softmax, SpAE | AE and SE | Freiburg | Confusion matrix | High detection time | 93.00 |
| Karim et al. | Softmax, DSAE | ESD function | Kaggle | Sens, spec | — | 94.00 |
| Wang et al. | Softmax, SSpDAE | AE and NSP | BONN | Sens, spec | Prec not mentioned | 95.00 |
Figure 10. Comparative study of accuracy (%) versus authors introducing AE models for seizure detection.
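The stacked/sparse autoencoders in the table all learn an encoder that compresses the input and a decoder that reconstructs it, trained to minimize reconstruction error; the compressed code then feeds a Softmax classifier. A minimal tied-weight linear autoencoder trained by gradient descent; the dimensions, learning rate, and random data are illustrative, not from any cited study:

```python
import numpy as np

def train_autoencoder(X, n_hidden=4, lr=0.01, epochs=200):
    """Tied-weight linear autoencoder: encode H = X W, decode
    Xhat = H W^T, minimizing mean squared reconstruction error."""
    rng = np.random.default_rng(0)
    n_features = X.shape[1]
    W = rng.standard_normal((n_features, n_hidden)) * 0.1
    losses = []
    for _ in range(epochs):
        H = X @ W                  # encoder
        Xhat = H @ W.T             # decoder (tied weights)
        err = Xhat - X
        losses.append(float(np.mean(err ** 2)))
        # Gradient of the MSE w.r.t. W, covering both its encoder
        # and decoder roles.
        grad = 2.0 * (X.T @ err @ W + err.T @ X @ W) / X.size
        W -= lr * grad
    return W, losses

X = np.random.default_rng(2).standard_normal((64, 10))
W, losses = train_autoencoder(X)
print(losses[0] > losses[-1])  # reconstruction error should decrease
```

The sparse and denoising variants (SpAE, SpDAE, DSAE) in the table add a sparsity penalty on `H` or corrupt `X` before encoding, but keep this same reconstruction objective.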
A review of recent research that applied classical ML models (SVM, ANN, and KNN) for seizure detection with their corresponding performances.
| Authors | Machine learning approaches | Feature selection methods | Dataset | Performance metrics | Limitations | Accuracy (%) |
|---|---|---|---|---|---|---|
| Logesparan et al. | SVM, ANN | Line length feature | CHB-MIT | ROC | Low accuracy | 52.00 |
| Zeiler and Fergus | QDA, DT, KNN, SVM | Time-frequency | BONN | Sens, spec | Low sens, prec | 85.00 |
| Birjandtalab et al. | ANN | Spectral power | CHB-MIT | F-measure | High detection time | 86.00 |
| Chen et al. | SVM | DWT | BONN | Confusion matrix | Low sens, prec | 86.83 |
| Parvez and Paul | LS-SVM | IMF, DCT-DWT, DCT, SVD | Freiburg | Spec, sens, acc | Low sens, prec for binary classification | 91.36 |
| Guo and DiPietro | KNN | Genetic programming | BONN | Class acc | Low accuracy | 93.50 |
| Nicolaou and Georgiou | SVM | Permutation entropy | CHB-MIT | Prec, rec, F-measure | Low prec and accuracy | 93.55 |
| Ahmad et al. | SVM | DWT | CHB-MIT | Avg | — | 94.80 |
| Zhang et al. | ELM, SVM | AE and SE | BCI Lab | Class accuracy | High time complexity | 95.58 |
| Shoeb and Guttag | SVM | Time-frequency | CHB-MIT | Sensitivity (sens) | — | 96.00 |
| Chen et al. | Naïve Bayes, SVM | Energy, variance, entropy, RMS | CHB-MIT | Prec, rec, F-measure | Low prec | 96.55 |
| Raghu et al. | RF, KNN, AdaBoost | Time-frequency | Bern-Barcelona | Sens, prec, NPR, ROC | NFR not mentioned | 97.60 |
| Mursalin et al. | KNN, SVM, RF | 15 features | BONN | Acc, sens, spec | — | 98.00 |
| Sharma et al. | LS-SVM | 2D/3D phase-space representation of intrinsic mode functions | BONN | Overall acc | — | 98.60 |
| Amin et al. | Naïve Bayes, SVM, KNN, MLP | Energy | EPILEPSY | Class acc | — | 98.75 |
| Satapathy et al. | Neural network, SVM | CWT, DWT | BONN | Overall acc | High detection time | 99.10 |
| Zabihi et al. | SVM | Time-frequency | CHB-MIT | Sens, spec | High time complexity | 99.32 |
| Hassan and Subasi | SVM | DWT | BONN | Class acc | — | 99.38 |
| Fasil and Rajesh | SVM | Energy | BONN, Barcelona | Class acc | — | 99.50 |
| Chen et al. | LS-SVM | Entropy types | BONN | Spec, acc, sens | — | 99.58 |
| Selvakumari et al. | LS-SVM | DWT, FFT | BONN | Class acc | High time complexity | 100 |
| Lahmiri and Shumel | KNN | GHE | BONN | Class acc | — | 100 |
| Kumar et al. | ANN, SVM | DWT-based approximate entropy | CHB-MIT | Overall acc | High time complexity | 100 |
| Tzallas et al. | ANN | Time-frequency features | BONN | Prec, rec, F-measure | — | 100 |
Figure 11. Comparative study of accuracy (%) versus authors introducing the SVM model for seizure detection.
Figure 12. Comparative study of accuracy (%) versus authors introducing the ANN model for seizure detection.
Figure 13. Comparative study of accuracy (%) versus authors introducing the KNN model for seizure detection.
Figure 14. Comparative study of accuracy (%) versus authors introducing the RF model for seizure detection.
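Among the classical models tabulated above, k-nearest neighbours is the simplest to state: a test window is assigned the majority label of its k closest training feature vectors. A minimal NumPy sketch on synthetic two-class features; the data, labels, and k are illustrative placeholders:

```python
import numpy as np

def knn_predict(train_X, train_y, test_X, k=3):
    """Majority vote over the k nearest training points
    (Euclidean distance in feature space)."""
    preds = []
    for x in test_X:
        dists = np.linalg.norm(train_X - x, axis=1)
        nearest = train_y[np.argsort(dists)[:k]]
        preds.append(np.bincount(nearest).argmax())
    return np.array(preds)

rng = np.random.default_rng(3)
# Two well-separated synthetic clusters standing in for
# "non-seizure" (0) and "seizure" (1) feature vectors.
X0 = rng.standard_normal((20, 2))
X1 = rng.standard_normal((20, 2)) + 6.0
train_X = np.vstack([X0, X1])
train_y = np.array([0] * 20 + [1] * 20)
test = np.array([[0.0, 0.0], [6.0, 6.0]])
print(knn_predict(train_X, train_y, test))  # [0 1]
```

SVMs and ANNs replace this local vote with a learned decision boundary, which is why they dominate the higher-accuracy rows of the table.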
A review of recent research that applied random forest in seizure detection with their corresponding accuracies.
| Authors | Machine learning approaches | Feature selection methods | Dataset | Performance metrics | Limitations | Accuracy (%) |
|---|---|---|---|---|---|---|
| Birjandtalab et al. | Random forest, KNN | Spectral power | CHB-MIT | Sens, F-measure, prec | Low sens, spec | 80.87 |
| Donos et al. | Random forest | Time, frequency | EPILEPSY | Sensitivity | Spec not mentioned | 93.80 |
| Siddiqui et al. | Random forest, boosting, decision forest | Nine statistical features | Bern-Barcelona | Prec, rec, F-measure | High time complexity | 96.67 |
| Wang et al. | Random forest classifiers | Std dev, energy, STFT, mean | BONN | Class acc | Low sens, spec for multi-class | 96.70 |
| Lee and Kim | Random forest, SVM | Frequency, 10-time | UCI | ROC-AUC | — | 98.00 |
| Sharma et al. | Random forest | IMF | Kaggle | Sens, spec, acc | Sens, spec not mentioned | 98.40 |
| Mursalin et al. | Random forest | DWT, entropy | Freiburg | Class acc | — | 98.45 |
| Mursalin et al. | Random forest | DWT, entropy | Zenodo | Class acc | Sens, spec not mentioned | 98.45 |
| Alickovic et al. | ANN, random forest, SVM, KNN | Power, mean, kurtosis, absolute mean, std dev, skewness | CHB-MIT | Sens, spec, acc | Time complexity | 100 |
| Wang et al. | Forest CERN | Nine statistical features | BONN, CHB-MIT | Class acc | — | 100 |
| Hosseini et al. | Random forest classifiers | L1-penalized robust regression | BONN, CHB-MIT | Class acc | — | 100 |
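Several of the random-forest studies above feed windowed statistical features (mean, standard deviation, skewness, kurtosis, energy) to the classifier. A minimal per-window feature extractor in NumPy; the exact feature set varies by study, so this five-feature selection is illustrative:

```python
import numpy as np

def window_features(w):
    """Five common statistical features of one EEG window."""
    mu, sd = w.mean(), w.std()
    skew = np.mean(((w - mu) / sd) ** 3)       # third standardized moment
    kurt = np.mean(((w - mu) / sd) ** 4) - 3   # excess kurtosis
    energy = np.sum(w ** 2)
    return np.array([mu, sd, skew, kurt, energy])

rng = np.random.default_rng(4)
windows = rng.standard_normal((9, 512))        # e.g. 9 two-second windows
feature_matrix = np.vstack([window_features(w) for w in windows])
print(feature_matrix.shape)  # (9, 5)
```

Each row of `feature_matrix` is one training sample; stacking rows from seizure and non-seizure windows yields the design matrix a random forest (or any classical model in the tables above) is fit on.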