Morteza Zangeneh Soroush, Keivan Maghooli, Seyed Kamaledin Setarehdan, Ali Motie Nasrabadi.
Abstract
BACKGROUND: Emotion recognition is an increasingly important field of research in brain computer interactions.
Keywords: Brain computer interactions; Dempster Shafer theory; Emotion identification; Independent component analysis; Local subset feature selection; Machine learning methods
Year: 2018 PMID: 30382882 PMCID: PMC6208176 DOI: 10.1186/s12993-018-0149-4
Source DB: PubMed Journal: Behav Brain Funct ISSN: 1744-9081 Impact factor: 3.759
Fig. 1 Block diagram of the proposed approach for emotion detection
Fig. 2 Arousal-valence plane and label distribution for the DEAP dataset
Fig. 3 Average score of each ICA component over all trials
Fig. 4 Average scores of the 32 ICA components for: a Q1, b Q2, c Q3, d Q4
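Figures 3 and 4 score ICA components, and the tables below compare source-separation algorithms (Runica, SOBI, COMBI, JADE). As background, here is a minimal sketch of linear source separation with FastICA from scikit-learn; synthetic signals stand in for EEG, and the mixing matrix is illustrative, not taken from the paper:

```python
import numpy as np
from sklearn.decomposition import FastICA

t = np.linspace(0, 8, 2000)

# Two independent sources: a sine wave and a square wave.
s1 = np.sin(2 * np.pi * t)
s2 = np.sign(np.sin(3 * np.pi * t))
S = np.c_[s1, s2]

# Mix them linearly, as scalp electrodes mix cortical sources.
A = np.array([[1.0, 0.5], [0.4, 1.0]])
X = S @ A.T

# Recover the sources (up to permutation, sign, and scale).
ica = FastICA(n_components=2, random_state=0)
S_est = ica.fit_transform(X)

# Each recovered component should correlate strongly with one true source.
corr = np.corrcoef(S.T, S_est.T)[:2, 2:]
print(np.round(np.abs(corr), 2))
```

In an EEG pipeline the recovered components, rather than the raw channels, would then be scored and selected, as in Figs. 3 and 4.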
Most common features in emotion recognition through EEG
| # | Feature description | Abbreviation | Explained in |
|---|---|---|---|
| 1 | Correlation dimension | CD | [ |
| 2 | Fractal dimension | FD | [ |
| 3 | Largest Lyapunov exponent | LLE | [ |
| 4 | Sample entropy | SpEn | [ |
| 5 | Recurrence rate | RR | [ |
| 6 | Determinism | DET | [ |
| 7 | Average diagonal line length | L | [ |
| 8 | Entropy | ENT | [ |
| 9 | Differential entropy | DeEn | [ |
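Several of the features listed above (e.g. sample entropy) are nonlinear measures computed directly from the EEG time series. A minimal sketch of sample entropy for a 1-D signal follows, using a common variant with embedding dimension `m` and tolerance `r` (the parameter defaults are illustrative, not taken from the paper):

```python
import numpy as np

def sample_entropy(x, m=2, r=None):
    """Sample entropy of a 1-D signal.

    Counts pairs of length-m templates that match within tolerance r
    (Chebyshev distance), repeats for length m+1, and returns the
    negative log of the ratio of the two counts.
    """
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.2 * x.std()

    def count_matches(m):
        # All overlapping templates of length m.
        templates = np.array([x[i:i + m] for i in range(len(x) - m + 1)])
        count = 0
        for i in range(len(templates) - 1):
            # Chebyshev distance from template i to all later templates.
            d = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            count += np.sum(d <= r)
        return count

    b = count_matches(m)      # matches of length m
    a = count_matches(m + 1)  # matches of length m + 1
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

rng = np.random.default_rng(0)
print(sample_entropy(rng.standard_normal(500)))              # higher for noise
print(sample_entropy(np.sin(np.linspace(0, 20 * np.pi, 500))))  # lower for a regular signal
```

A regular signal yields many template matches at both lengths and hence a low sample entropy; white noise yields few length-`m+1` matches and a high value.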
Fig. 5 An instance of different localities and features in tree representation. The sub-tree r corresponds to the compound locality cl consisting of two single localities [42]
Fig. 6 Flowchart of the proposed FBS-based emotion recognition system
A comparison among source separation algorithms with respect to different classifiers
| Runica | SOBI | COMBI | JADE | p-value | |
|---|---|---|---|---|---|
| Index channels | 14 | 16 | 17 | 15 | – |
| MLP | |||||
| Accuracy (%) | 77.16 | 79.57 | 76.33 | 80.28 | 0.0646 |
| Time (min) | 118.46 | 120.78 | 116.89 | 113.45 | |
| KNN | |||||
| Accuracy (%) | 79.11 | 81.46 | 77.16 | 73.28 | 0.0894 |
| Time (min) | 112.56 | 110.32 | 118.96 | 103.52 | |
| Bayes | |||||
| Accuracy (%) | 82.57 | 84.65 | 78.24 | 79.67 | 0.0743 |
| Time (min) | 121.32 | 122.85 | 119.65 | 118.45 | |
| SVM | |||||
| Accuracy (%) | 84.65 | 86.78 | 85.96 | 83.13 | 0.0531 |
| Time (min) | 115.43 | 112.47 | 108.75 | 111.65 | |
| Modified DST | |||||
| Accuracy (%) | 88.49 | | 86.72 | 89.32 | |
| Time (min) | 122.25 | 120.82 | 123.67 | 126.95 | |
| p-value | 0.0631 | | 0.0787 | | |
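The "Modified DST" rows refer to classifier fusion with Dempster–Shafer theory, in which each classifier's output is treated as a mass function over the four emotion quarters and the masses are combined. A minimal sketch of Dempster's rule of combination follows; the two mass functions are illustrative, not taken from the paper:

```python
def dempster_combine(m1, m2):
    """Combine two mass functions (dicts frozenset -> mass) with Dempster's rule."""
    combined = {}
    conflict = 0.0
    for a, ma in m1.items():
        for b, mb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + ma * mb
            else:
                conflict += ma * mb  # mass on contradictory evidence
    if conflict >= 1.0:
        raise ValueError("total conflict: sources are incompatible")
    # Standard Dempster renormalization by 1 - K.
    return {h: m / (1.0 - conflict) for h, m in combined.items()}

theta = frozenset({"Q1", "Q2", "Q3", "Q4"})
# Two hypothetical classifiers, each leaving some mass on the full frame (uncertainty).
clf1 = {frozenset({"Q1"}): 0.6, frozenset({"Q2"}): 0.2, theta: 0.2}
clf2 = {frozenset({"Q1"}): 0.5, frozenset({"Q3"}): 0.3, theta: 0.2}
fused = dempster_combine(clf1, clf2)
best = max(fused, key=fused.get)
print(sorted(best), round(fused[best], 3))
```

Here both sources favor Q1, so the fused mass on {Q1} exceeds either source's individual belief; this agreement-amplifying behavior is what makes DST attractive for combining classifiers.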
A comparison among the values of the selected electrodes in each quarter with respect to source separation algorithms
| Q1 | Q2 | Q3 | Q4 | Intersection | |
|---|---|---|---|---|---|
| Runica | Fp1, Fp2, Fz, F4, F3, F8, Cz, C4, C3, Pz, P3, T4 | Pz, P4, P3, F4, O1, T4, F3 | T3, T4, C3, T6, P3, T5, P4, F4, O1 | P3, T4, F4, Pz, P4, O1, O2, T6, T5, F3 | F3, F4, O1, T4 |
| SOBI | Fp1, Fz, F4, F3, F8, Cz, P4, Cz, Pz, P3, O2 | Pz, P4, P3, O2, Cz, F3 | F3, T4, C3, T6, P3, T5, Cz, O2 | P3, Cz, Pz, P4, O1, O2, T6, T5, F3 | Cz, O2, F3 |
| COMBI | Fp1, Fp2, Fz, F4, O1, F8, Cz, C4, C3, Pz, P3, T4 | Pz, P4, P3, O1, T4, F3, Fp1 | T3, T4, C3, T6, P3, T5, P4, O1, Fp1 | P3, Fp1, Pz, P4, O1, O2, T4, T5, F3 | O1, Fp1, T4 |
| JADE | F3, Fp2, Fz, F4, F3, F8, Cz, C4, C3, Pz, P3, O1, T4 | Pz, P4, P3, O1, T4, F3 | T3, T4, C3, T6, P3, T5, F4, F3, O1 | P3, F4, Pz, P4, O1, O2, T4, T5, F3 | F3, O1, T4 |
Confusion and confidence matrices of the proposed method
| Decision \ Target | Q1 | Q2 | Q3 | Q4 |
|---|---|---|---|---|
| Q1 | 407 (88.86%) | 8 (2.70%) | 9 (3.46%) | 10 (3.75%) |
| Q2 | 23 (5.02%) | 268 (90.54%) | 8 (3.07%) | 3 (1.12%) |
| Q3 | 17 (3.71%) | 13 (4.39%) | 236 (90.76%) | 5 (1.87%) |
| Q4 | 11 (2.40%) | 7 (2.36%) | 7 (2.69%) | 248 (93.23%) |
Each cell gives the number of target-class samples assigned to that decision, together with its percentage of the target-class total (the confidence matrix); diagonal entries correspond to correct classifications
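The percentages can be recovered from the counts by normalizing each target column of the confusion matrix by its column total, and the overall accuracy is the diagonal share of all samples. A quick check using the counts from the table above (results agree with the table to within rounding):

```python
import numpy as np

# Rows = decision Q1..Q4, columns = target Q1..Q4 (counts from the table).
confusion = np.array([
    [407,   8,   9,  10],
    [ 23, 268,   8,   3],
    [ 17,  13, 236,   5],
    [ 11,   7,   7, 248],
])

# Confidence matrix: normalize each target column to percentages.
confidence = 100 * confusion / confusion.sum(axis=0, keepdims=True)
print(np.round(confidence, 2))

# Overall accuracy is the diagonal share of all samples.
accuracy = 100 * np.trace(confusion) / confusion.sum()
print(round(accuracy, 2))
```

This reproduces the reported overall accuracy of roughly 90.5%.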
Fig. 7 Average activation in brain regions for each emotion: a Q1, b Q2, Q4, c Q3
Fig. 8Share of activation of each brain region for each class of emotion
A comparison of methods reported in other papers with the proposed method for emotion recognition
| Authors | Year | Method | Classification accuracy (%) |
|---|---|---|---|
| Fan and Chou [ | 2018 | Recurrence quantification analysis, logistic regression | 75.7% |
| Zhong et al. [ | 2017 | Spectral and time features, multiple-fusion-layer based ensemble classifier of stacked autoencoder (MESAE) | 77.19% (arousal accuracy), 76.17% (valence accuracy) |
| Atkinson and Campos [ | 2016 | Statistical and spectral features, Hjorth parameters, fractal dimension, minimum-Redundancy-Maximum-Relevance, support vector machine | 62.39% (valence), 60.72% (arousal) |
| Xu and Plataniotis [ | 2016 | Power spectral density, stacked denoising autoencoders, deep belief network | 85.86% (arousal accuracy of SDAE), 84.77% (valence accuracy of SDAE), 88.33% (arousal accuracy of DBN), 88.59% (valence accuracy of DBN) |
| Jie et al. [ | 2014 | Sample entropy, support vector machine | 79.11% |
| Yin et al. [ | 2017 | Spectral and time features, multiple-fusion-layer based ensemble classifier of stacked autoencoder | 77.19% (arousal accuracy) |
| Tripathi et al. [ | 2017 | Convolutional neural networks, deep neural network | 58.44% (valence, DNN), 55.70% (arousal, DNN), 66.79% (valence, CNN), 57.58% (arousal, CNN) |
| Alam et al. [ | 2016 | Convolutional neural networks | 81.17% |
| Kumar et al. [ | 2016 | Bispectrum, least square support vector machine, radial basis function, linear neural network | 64.86% (arousal), 61.17% (valence) |
| Our work | 2018 | The proposed method | 90.54% |