Evangelos Antoniou, Pavlos Bozios, Vasileios Christou, Katerina D. Tzimourta, Konstantinos Kalafatakis, Markos G. Tsipouras, Nikolaos Giannakeas, Alexandros T. Tzallas.
Abstract
Discrimination of eye movements and visual states is a flourishing field of research, and there is an urgent need for non-manual, EEG-based wheelchair control and navigation systems. This paper presents a novel system that utilizes a brain-computer interface (BCI) to capture electroencephalographic (EEG) signals from human subjects during eye movements and subsequently classify them into six categories by applying a random forests (RF) classification algorithm. RF is an ensemble learning method that constructs a series of decision trees, where each tree gives a class prediction and the class with the highest number of votes becomes the model's prediction. The categories of the proposed random forests brain-computer interface (RF-BCI) are defined according to the position of the subject's eyes: open, closed, left, right, up, and down. The purpose of RF-BCI is to serve as an EEG-based control system for driving an electromechanical wheelchair (rehabilitation device). The proposed approach was tested on a dataset containing 219 records taken from 10 different patients. The BCI used the EPOC Flex head cap system, which includes 32 saline felt sensors for capturing the subjects' EEG signals. Each sensor captured four different brain waves (delta, theta, alpha, and beta). These signals were split into 4-second windows, resulting in 512 samples per record, and the band energy was extracted for each EEG rhythm. The proposed system was compared with naïve Bayes, Bayes network, k-nearest neighbors (K-NN), multilayer perceptron (MLP), support vector machine (SVM), J48-C4.5 decision tree, and Bagging classification algorithms. The experimental results showed that the RF algorithm outperformed the other approaches, achieving a high accuracy of 85.39% for the 6-class classification problem.
This method exploits the high spatial information acquired from the Emotiv EPOC Flex wearable EEG recording device and successfully demonstrates the potential of this device to be used in BCI wheelchair technology.
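The windowing and band-energy step described in the abstract can be sketched as follows. This is only an illustrative sketch: the 128 Hz sampling rate is inferred from the stated 4-second, 512-sample windows, and the band boundaries are conventional EEG ranges rather than values given in the paper.

```python
import numpy as np

FS = 128  # Hz; inferred from 4-second windows of 512 samples
# Conventional EEG rhythm boundaries in Hz (assumed, not stated in the paper)
BANDS = {"delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

def band_energies(window, fs=FS):
    """Energy of each EEG rhythm in one window (hypothetical feature extractor)."""
    freqs = np.fft.rfftfreq(len(window), d=1.0 / fs)
    spectrum = np.abs(np.fft.rfft(window)) ** 2
    return {name: float(spectrum[(freqs >= lo) & (freqs < hi)].sum())
            for name, (lo, hi) in BANDS.items()}

# Sanity check: a 10 Hz test tone should put most of its energy in the alpha band
t = np.arange(512) / FS
feats = band_energies(np.sin(2 * np.pi * 10 * t))
```

One such four-value feature vector per sensor and window would then feed the classifier.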
Keywords: EEG; EPOC Flex; brain–computer interface; electroencephalogram; electrooculogram; eye movement; eye tracking; random forests
Year: 2021 PMID: 33801663 PMCID: PMC8036672 DOI: 10.3390/s21072339
Source DB: PubMed Journal: Sensors (Basel) ISSN: 1424-8220 Impact factor: 3.576
Figure 1. The random forests brain–computer interface (RF-BCI) system architecture. Initially, the EEG signal is captured using the headset and undergoes a preprocessing filtering procedure to remove unwanted noise and separate the frequency bands. The next step is feature extraction and the association of each dataset entry with an eye movement. Finally, the dataset is used to train the RF classifier.
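The RF prediction rule described in the abstract, where each trained tree casts a class vote and the majority class wins, reduces to a simple majority vote. A minimal sketch (`forest_predict` is a hypothetical helper, not the authors' code):

```python
from collections import Counter

def forest_predict(tree_predictions):
    """Majority vote over individual tree predictions, as in random forests:
    the class predicted by the most trees becomes the model's prediction."""
    return Counter(tree_predictions).most_common(1)[0][0]

# e.g. five hypothetical trees voting on one EEG window
vote = forest_predict(["left", "left", "up", "left", "closed"])  # → "left"
```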
Figure 2. The splitting process for a continuous predictor variable. In the case of a continuous predictor, the split is made using a split point: points where the predictor has a lower value than the split point go to the left, and points with an equal or higher value than the split point go to the right.
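The split rule in Figure 2 can be expressed as a short sketch (`split_continuous` is a hypothetical helper for illustration):

```python
def split_continuous(points, values, split_point):
    """Route samples by a continuous predictor: strictly lower values go left,
    equal-or-higher values go right (the rule described in Figure 2)."""
    left = [p for p, v in zip(points, values) if v < split_point]
    right = [p for p, v in zip(points, values) if v >= split_point]
    return left, right

left, right = split_continuous(["a", "b", "c"], [1.0, 2.0, 3.0], 2.0)
# "b" has a value equal to the split point, so it is routed right
```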
Figure 3. Comparison results bar plot. The bar plot visualizes the test-set results of the proposed method against seven other popular classification algorithms. The RF method used in RF-BCI produced significantly better results, with an accuracy 6.85% higher than that of the second-best algorithm (SVM).
Classification results (%) for each eye movement and visual state for random forests; rows are actual classes, columns are predicted classes.
| Actual Class | Class A | Class B | Class C | Class D | Class E | Class F |
|---|---|---|---|---|---|---|
| Class A | 96.77 | 3.23 | 0 | 0 | 0 | 0 |
| Class B | 0 | 95.35 | 0 | 0 | 2.33 | 2.33 |
| Class C | 0 | 0 | 78.05 | 19.51 | 2.44 | 0 |
| Class D | 0 | 0 | 24.39 | 75.61 | 0 | 0 |
| Class E | 0 | 6.45 | 3.23 | 0 | 83.87 | 6.45 |
| Class F | 0 | 0 | 3.13 | 3.13 | 9.38 | 84.38 |
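As a quick sanity check, the confusion matrix above can be loaded and inspected programmatically. The values are copied from the table; the script itself is only an illustrative sketch (the table does not state which eye state each class label denotes):

```python
import numpy as np

# Row-normalized confusion matrix (%) from the results table; rows = actual class
conf = np.array([
    [96.77,  3.23,  0.00,  0.00,  0.00,  0.00],
    [ 0.00, 95.35,  0.00,  0.00,  2.33,  2.33],
    [ 0.00,  0.00, 78.05, 19.51,  2.44,  0.00],
    [ 0.00,  0.00, 24.39, 75.61,  0.00,  0.00],
    [ 0.00,  6.45,  3.23,  0.00, 83.87,  6.45],
    [ 0.00,  0.00,  3.13,  3.13,  9.38, 84.38],
])
per_class = np.diag(conf)  # per-class accuracy (%): the table's diagonal
row_sums = conf.sum(axis=1)  # each row should sum to roughly 100%
```

The diagonal shows that classes C and D are confused with each other far more often than any other pair, which accounts for most of the gap below 100% accuracy.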
A comparison of the performances of the various methods proposed in the literature for wheelchair navigation.
| Study | Control Method | No. of Participants | Device | No. of Channels | Methodology | Classification Accuracy |
|---|---|---|---|---|---|---|
| Zgallai et al. [ | Mental commands | 10 | Emotiv EPOC+ | 14 | Fast Fourier Transform (FFT), | 96.29% |
| Sim et al. [ | Mental commands | 5 | Emotiv EPOC+ | 14 | FFT, (4-class) | 90.00% |
| Tanaka et al. [ | Mental commands | 6 | Not reported | 13 | Filtering, FFT, standard deviation, correlation coefficient, (2-class) | 80.00% |
| Ben Taher et al. [ | Eye movement | Not reported | Emotiv EPOC+ | 14 | 4-class | Not reported |
| Aziz et al. [ | Eye movement | 20 | g-USBAMP | 4 | Filtering, | 98.00% |
| This study | Eye movement | 10 | Emotiv EPOC Flex | 32 | Filtering, band energy extraction, random forests (6-class) | 85.39% |