| Literature DB >> 29849544 |
Bartosz Binias, Dariusz Myszor, Krzysztof A Cyran.
Abstract
This work considers the problem of utilizing electroencephalographic signals for use in systems designed for monitoring and enhancing the performance of aircraft pilots. Systems with such capabilities are generally referred to as cognitive cockpits. This article provides a description of the potential that is carried by such systems, especially in terms of increasing flight safety. Additionally, a neuropsychological background of the problem is presented. Conducted research was focused mainly on the problem of discrimination between states of brain activity related to idle but focused anticipation of visual cue and reaction to it. Especially, a problem of selecting a proper classification algorithm for such problems is being examined. For that purpose an experiment involving 10 subjects was planned and conducted. Experimental electroencephalographic data was acquired using an Emotiv EPOC+ headset. Proposed methodology involved use of a popular method in biomedical signal processing, the Common Spatial Pattern, extraction of bandpower features, and an extensive test of different classification algorithms, such as Linear Discriminant Analysis, k-nearest neighbors, and Support Vector Machines with linear and radial basis function kernels, Random Forests, and Artificial Neural Networks.Entities:
Year: 2018 PMID: 29849544 PMCID: PMC5914152 DOI: 10.1155/2018/2703513
Source DB: PubMed Journal: Comput Intell Neurosci
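The pipeline named in the abstract (Common Spatial Pattern filtering followed by log-bandpower features fed to a classifier) can be sketched as below. This is a minimal illustration of the general technique, not the authors' implementation; the trial shapes, component count, and the synthetic data in the usage example are assumptions.

```python
import numpy as np

def csp_filters(trials_a, trials_b, n_components=2):
    """Common Spatial Pattern filters for two classes of EEG trials.

    trials_*: arrays of shape (n_trials, n_channels, n_samples).
    Returns spatial filters W of shape (n_components, n_channels),
    taken from both ends of the eigenvalue spectrum.
    """
    def mean_cov(trials):
        covs = []
        for x in trials:
            c = x @ x.T
            covs.append(c / np.trace(c))  # normalise by total power
        return np.mean(covs, axis=0)

    c_a, c_b = mean_cov(trials_a), mean_cov(trials_b)
    # Generalised eigenvalue problem: C_a w = lambda * (C_a + C_b) w
    evals, evecs = np.linalg.eig(np.linalg.solve(c_a + c_b, c_a))
    order = np.argsort(evals.real)[::-1]
    evecs = evecs[:, order].real
    # Leading/trailing eigenvectors maximise variance for one class each
    half = n_components // 2
    picks = list(range(half)) + list(range(-half, 0))
    return evecs[:, picks].T

def log_bandpower(trials, w):
    """Log-variance (a bandpower proxy) of the spatially filtered trials."""
    return np.array([np.log(np.var(w @ x, axis=1)) for x in trials])
```

In a usage sketch, two synthetic classes with power concentrated on different channels become linearly separable in the log-bandpower feature space, after which any of the classifiers listed in the abstract could be trained on the features.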
Figure 1 Positions of electrodes in the standard 10-10 electrode montage system (own source based on [19]).
Figure 2 Interior of the flight simulator (cockpit) used and a simulation screen.
Figure 3 Concept of pre-event and event-related class trial extraction (own source).
Figure 4 Comparison of classifier performance obtained for all subjects.
Linear Discriminant Analysis: accuracy of classification achieved for each subject (mean accuracy 73.01%).
| Subject | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | 10 |
|---|---|---|---|---|---|---|---|---|---|---|
| Accuracy | 79.41% | 84.21% | 78.57% | 82.69% | 66.67% | 69.57% | 68.18% | 73.21% | 66.67% | 60.87% |
k-Nearest Neighbors: accuracy of classification achieved for each subject (mean accuracy 69.45%).
| Subject | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | 10 |
|---|---|---|---|---|---|---|---|---|---|---|
| Accuracy | 79.41% | 84.21% | 59.52% | 80.77% | 66.67% | 56.52% | 68.18% | 80.36% | 66.67% | 52.17% |
Support Vector Machines with linear kernel: accuracy of classification achieved for each subject (mean accuracy 67.29%).
| Subject | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | 10 |
|---|---|---|---|---|---|---|---|---|---|---|
| Accuracy | 76.47% | 84.21% | 64.29% | 80.77% | 63.89% | 56.52% | 68.18% | 69.64% | 52.38% | 56.52% |
Support Vector Machines with radial basis function kernel: accuracy of classification achieved for each subject (mean accuracy 69.32%).
| Subject | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | 10 |
|---|---|---|---|---|---|---|---|---|---|---|
| Accuracy | 73.53% | 84.21% | 73.81% | 84.62% | 69.44% | 56.52% | 73.21% | 65.91% | 61.90% | 50.00% |
Random Forest: accuracy of classification achieved for each subject (mean accuracy 68.72%).
| Subject | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | 10 |
|---|---|---|---|---|---|---|---|---|---|---|
| Accuracy | 76.47% | 86.84% | 54.76% | 84.62% | 69.44% | 60.87% | 65.91% | 78.57% | 61.90% | 47.83% |
Artificial Neural Networks: accuracy of classification achieved for each subject (mean accuracy 77.77%).
| Subject | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | 10 |
|---|---|---|---|---|---|---|---|---|---|---|
| Accuracy | 88.23% | 92.10% | 78.57% | 86.53% | 77.77% | 67.39% | 68.18% | 80.35% | 69.04% | 69.56% |
Accuracy of classification summarized across subjects for each classifier (mean, sample standard deviation, and first/third quartiles of the per-subject accuracies).
| Classifier | Mean | Std. dev. | Q1 | Q3 |
|---|---|---|---|---|
| LDA | 73.01% | 7.85% | 66.67% | 79.41% |
| kNN | 69.45% | 11.28% | 59.52% | 80.36% |
| SVMLIN | 67.29% | 10.72% | 56.52% | 76.47% |
| SVMRBF | 69.32% | 11.12% | 61.90% | 73.81% |
| RF | 68.72% | 12.85% | 60.87% | 78.57% |
| NN | 77.77% | 9.08% | 69.04% | 86.53% |
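The summary rows can be reproduced from the per-subject tables above. The last two columns match Tukey-hinge quartiles (medians of the lower and upper halves of the sorted accuracies), and the spread column matches the sample standard deviation. A minimal check in Python, using the LDA row as input:

```python
import statistics

# Per-subject LDA accuracies (percent), copied from the table above
lda = [79.41, 84.21, 78.57, 82.69, 66.67, 69.57, 68.18, 73.21, 66.67, 60.87]

def summarize(acc):
    """Mean, sample standard deviation, and Tukey-hinge quartiles
    (medians of the lower and upper halves of the sorted values)."""
    s = sorted(acc)
    half = len(s) // 2
    return (statistics.mean(s), statistics.stdev(s),
            statistics.median(s[:half]), statistics.median(s[half:]))

mean, sd, q1, q3 = summarize(lda)  # approx. (73.01, 7.85, 66.67, 79.41)
```

Note that `statistics.stdev` is the sample (n − 1) standard deviation, which is what matches the reported 7.85% for LDA; the population standard deviation would not.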
Figure 5 A summary of general accuracies obtained for each subject.