| Literature DB >> 35465467 |
Sateesh Kumar Reddy Chirasani, Suchetha Manikandan.
Abstract
Electroencephalogram (EEG) is a common diagnostic tool for measuring the seizure activity of the brain. Many deep learning techniques have been introduced to analyze EEG. These methods show phenomenal results, although they are limited by computational complexity. Our objective was to develop a novel algorithm that achieves maximum classification accuracy with minimal computational complexity. To this end, we introduce a novel convolutional architecture integrated with a hierarchical attention mechanism. The model comprises three parts: a feature extraction layer, which extracts the convolved feature map; a hierarchical attention layer, which produces a weighted hierarchical feature map; and a classification layer, which uses the weighted features to classify healthy and seizure subjects. The proposed model can extract significant information from the EEG signal to classify seizure subjects, and it is compared experimentally with several existing deep convolutional algorithms. The experimental outcomes show that the proposed model achieves higher accuracy with lower computational time.
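The two-level attention described in the abstract (weighting within each channel's convolved features, then weighting across channels) can be sketched as follows. This is a minimal NumPy illustration, not the authors' implementation: the array shapes, the additive (tanh) scoring function, and all weight names are assumptions.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D score vector."""
    e = np.exp(x - x.max())
    return e / e.sum()

def attention_pool(X, W, v):
    """Additive attention over the rows of X (n, d).

    Returns the attention-weighted sum (d,) and the weights (n,).
    """
    scores = np.tanh(X @ W) @ v   # one scalar score per row
    alpha = softmax(scores)       # weights, non-negative, sum to 1
    return alpha @ X, alpha

def hierarchical_attention(feature_map, Wt, vt, Wc, vc):
    """Two-level attention over a convolved EEG feature map.

    feature_map: (channels, time, d) features from the convolutional layers.
    Level 1 attends over time within each channel; level 2 attends
    across the resulting channel summaries.
    """
    C = feature_map.shape[0]
    chan_summaries = np.stack(
        [attention_pool(feature_map[c], Wt, vt)[0] for c in range(C)]
    )                                             # (channels, d)
    pooled, alpha_c = attention_pool(chan_summaries, Wc, vc)
    return pooled, alpha_c                        # (d,), (channels,)

# Usage with random weights (stand-ins for learned parameters):
rng = np.random.default_rng(0)
F = rng.normal(size=(4, 10, 8))          # 4 channels, 10 steps, 8-dim features
Wt, vt = rng.normal(size=(8, 6)), rng.normal(size=6)
Wc, vc = rng.normal(size=(8, 6)), rng.normal(size=6)
pooled, alpha_c = hierarchical_attention(F, Wt, vt, Wc, vc)
```

The pooled vector would then feed the classification layer (an SVM, per the keywords); the channel weights `alpha_c` indicate which channels the model emphasized.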
Keywords: Attention mechanism; Convolutional neural network; Electroencephalogram (EEG); Epilepsy; Feature selection; Support vector machine
Year: 2022 PMID: 35465467 PMCID: PMC9012945 DOI: 10.1007/s00500-022-07122-8
Source DB: PubMed Journal: Soft comput ISSN: 1432-7643 Impact factor: 3.732
Fig. 1 Overall architecture of the proposed methodology
Fig. 2 Proposed hierarchical attention mechanism
Different cases for classification
| Cases | Grouping | Classes |
|---|---|---|
| Case I | Set A | Healthy |
| | Set E | Ictal |
| Case II | Set A | Healthy |
| | Set C | Inter-ictal |
| Case III | Set ABCD | Healthy |
| | Set E | Ictal |
The classification performance of different convolutional algorithms
| Cases | Parameters | Conventional CNN | Attention-based CNN | Proposed hierarchical attention-based CNN |
|---|---|---|---|---|
| Case I | Accuracy (%) | 96.19 | 97.28 | 98.33 |
| | Sensitivity | 0.9474 | 0.9780 | 0.9800 |
| | Specificity | 0.9775 | 0.9677 | 0.9700 |
| | Precision | 0.9783 | 0.9674 | 0.9780 |
| | F-measure | 0.9626 | 0.9727 | 0.9800 |
| | MCC | 0.9244 | 0.9457 | 0.9600 |
| Case II | Accuracy (%) | 95.11 | 95.11 | 95.56 |
| | Sensitivity | 0.9368 | 0.9462 | 0.9667 |
| | Specificity | 0.9663 | 0.9560 | 0.9444 |
| | Precision | 0.9674 | 0.9565 | 0.9457 |
| | F-measure | 0.9519 | 0.9514 | 0.9560 |
| | MCC | 0.9027 | 0.9022 | 0.9113 |
| Case III | Accuracy (%) | 95.65 | 96.20 | 97.21 |
| | Sensitivity | 0.9375 | 0.9570 | 0.9775 |
| | Specificity | 0.9773 | 0.9670 | 0.9667 |
| | Precision | 0.9783 | 0.9674 | 0.9667 |
| | F-measure | 0.9574 | 0.9622 | 0.9721 |
| | MCC | 0.9139 | 0.9240 | 0.9442 |
The average classification performance of different algorithms
| Parameters | Conventional CNN | Attention-based CNN | Proposed hierarchical attention-based CNN |
|---|---|---|---|
| Accuracy (%) | 95.65 | 96.20 | 97.03 |
| Sensitivity | 0.9406 | 0.9604 | 0.9747 |
| Specificity | 0.9737 | 0.9636 | 0.9604 |
| Precision | 0.9746 | 0.9638 | 0.9634 |
| F-measure | 0.9573 | 0.9621 | 0.9694 |
| MCC | 0.9137 | 0.9240 | 0.9385 |
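The parameters tabulated above (accuracy, sensitivity, specificity, precision, F-measure, MCC) all derive from binary confusion-matrix counts. A small self-contained sketch using the standard formulas; the function name and count ordering are my own, not from the paper:

```python
import math

def binary_metrics(tp, tn, fp, fn):
    """Standard binary-classification measures from confusion-matrix counts."""
    acc  = (tp + tn) / (tp + tn + fp + fn)     # accuracy (fraction, not %)
    sens = tp / (tp + fn)                      # sensitivity (recall)
    spec = tn / (tn + fp)                      # specificity
    prec = tp / (tp + fp)                      # precision
    f1   = 2 * prec * sens / (prec + sens)     # F-measure
    mcc  = (tp * tn - fp * fn) / math.sqrt(    # Matthews correlation coefficient
        (tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    return dict(accuracy=acc, sensitivity=sens, specificity=spec,
                precision=prec, f_measure=f1, mcc=mcc)

# Example: 45 true positives, 45 true negatives, 5 of each error type.
m = binary_metrics(tp=45, tn=45, fp=5, fn=5)
```

For this balanced example every rate comes out to 0.9 and the MCC to 0.8; on the imbalanced Case III split (400 healthy vs 100 ictal segments), MCC is the more informative summary.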
Comparative study of proposed method with traditional methods
| Author | Method | Cases | Accuracy (%) |
|---|---|---|---|
| Lee et al. | WT, phase-space reconstruction and ED + NEWFM | A vs E | 98.17 |
| Nicolaou and Georgiou | Permutation entropy + SVM classifier | A vs E | 93.55 |
| Riaz et al. | Time-frequency features in EMD + SVM | A vs E | 99 |
| | | D vs E | 93 |
| | | AB vs CD vs E | 96 |
| | | A vs D vs E | 85 |
| | | ABCD vs E | 83 |
| Lin et al. | Stacked sparse autoencoder + softmax classifier | A vs E | 95.5 |
| | | A vs D | 86.42 |
| Chandaka et al. | Cross-correlation-aided SVM classifier | A vs E | 95.96 |
| Proposed | Hierarchical attention-based deep learning + SVM | A vs E | 98.33 |
| | | A vs C | 95.56 |
| | | ABCD vs E | 97.21 |
Fig. 3 ROC curve between the healthy and ictal classes
Fig. 4Computational time comparison of various convolutional neural network approaches