Jianzhuo Yan, Shangbin Chen, Sinuo Deng.
Abstract
As an advanced function of the human brain, emotion has a significant influence on study, work, and other aspects of human life. Artificial intelligence now plays an important role in recognizing human emotion accurately. EEG-based emotion recognition (ER), one application of the brain–computer interface (BCI), has become increasingly popular in recent years. However, because human emotions are ambiguous and EEG signals are complex, an EEG-ER system that recognizes emotions with high accuracy is not easy to achieve. Taking the time scale as its starting point, this paper chooses the recurrent neural network as the basis of the screening model. Drawing on the rhythmic and temporal-memory characteristics of EEG, this research proposes a Rhythmic Time EEG Emotion Recognition Model (RT-ERM), built on a Long Short-Term Memory (LSTM) network, for classifying emotional valence and arousal. With this model, the classification results differ across rhythms and time scales. The optimal rhythm and time scale of RT-ERM are obtained from the classification accuracies achieved at different rhythms and different time scales; emotional EEG is then classified using the best time scale corresponding to each rhythm. Finally, comparison with other existing emotional EEG classification methods shows that the rhythm and time-scale choices of the model contribute to the accuracy of RT-ERM.
Keywords: EEG; Emotion recognition; LSTM; Rhythm and time characteristics
Year: 2019 PMID: 31549331 PMCID: PMC6757079 DOI: 10.1186/s40708-019-0100-y
Source DB: PubMed Journal: Brain Inform ISSN: 2198-4026
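The abstract describes classifying valence and arousal with an LSTM over rhythm-band EEG windows. As a minimal sketch of the core mechanism, the following NumPy code runs a single LSTM cell over one EEG window; the sizes (32 channels, 128 Hz as in DEAP, hidden size 16) and the random weights are illustrative assumptions, not the paper's trained model.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h, c, W, U, b):
    """One LSTM step. x: input (d,), h: hidden (k,), c: cell (k,).
    W: (4k, d), U: (4k, k), b: (4k,). Gate order: i, f, o, g."""
    k = h.shape[0]
    z = W @ x + U @ h + b
    i = sigmoid(z[:k])            # input gate
    f = sigmoid(z[k:2 * k])       # forget gate
    o = sigmoid(z[2 * k:3 * k])   # output gate
    g = np.tanh(z[3 * k:])        # candidate cell state
    c_new = f * c + i * g         # memory update (temporal memory)
    h_new = o * np.tanh(c_new)
    return h_new, c_new

# Hypothetical sizes: 32 EEG channels, hidden size 16, a 2 s window
# at 128 Hz (DEAP sampling rate) -> 256 time steps.
rng = np.random.default_rng(0)
d, k, T = 32, 16, 256
W = rng.standard_normal((4 * k, d)) * 0.1
U = rng.standard_normal((4 * k, k)) * 0.1
b = np.zeros(4 * k)
x_seq = rng.standard_normal((T, d))   # one rhythm-band EEG window

h, c = np.zeros(k), np.zeros(k)
for x in x_seq:
    h, c = lstm_step(x, h, c, W, U, b)

# The final hidden state feeds a 2-way softmax for valence (or arousal).
logits = rng.standard_normal((2, k)) @ h
probs = np.exp(logits) / np.exp(logits).sum()
print(probs)
```

In practice the weights would be learned by backpropagation through time; this sketch only shows how the gated memory cell carries information across the window's time steps.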
Fig. 1 Block diagram of window segmentation
Fig. 2 An emotion recognition model inspired by the "rhythm–time" characteristic
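Fig. 1's window segmentation amounts to cutting each trial into fixed-length windows whose length is the time scale under test (0.25 s to 6 s in the tables below). A small sketch of that step, assuming DEAP's 128 Hz sampling rate and a non-overlapping window by default; the function name and step parameter are illustrative:

```python
import numpy as np

FS = 128  # DEAP sampling rate in Hz (assumed)

def segment(signal, scale_s, step_s=None):
    """Cut a 1-D signal into windows of `scale_s` seconds.
    Non-overlapping by default; pass step_s for a sliding window."""
    win = int(scale_s * FS)
    step = int((step_s or scale_s) * FS)
    n = (len(signal) - win) // step + 1
    return np.stack([signal[i * step : i * step + win] for i in range(n)])

sig = np.arange(60 * FS, dtype=float)   # 60 s dummy single-channel trace
for scale in (0.25, 0.5, 1.0, 2.0, 6.0):
    wins = segment(sig, scale)
    print(scale, wins.shape)            # (number of windows, samples per window)
```

Each resulting window (per channel, per rhythm band) is what the recurrent model consumes as one sequence.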
The classification results of RT-ERM with different time scales for θ rhythm under the dimension of emotion valence
| Time scale (s) | ACC/% (mean ± std.) | TPR/% (mean ± std.) | TNR/% (mean ± std.) | Macro-F1 |
|---|---|---|---|---|
| 0.25 | 56.35 ± 2.4113 | 52.64 ± 9.4194 | 59.99 ± 11.5745 | 54.9165 |
| 0.5 | 59.5 ± 3.5681 | 59.48 ± 10.2801 | 59.48 ± 11.3018 | 59.9822 |
| 0.75 | 58.97 ± 4.8805 | 57.9 ± 7.0734 | 59.99 ± 5.8587 | 58.5105 |
| 1.0 | 58.44 ± 3.6729 | 58.43 ± 10.1237 | 58.44 ± 9.2477 | 58.5264 |
| 2.0 | 61.59 ± 4.8816 | 60.0 ± 9.7751 | 63.17 ± 7.4529 | 60.9783 |
| 3.0 | 59.49 ± 3.1551 | 56.85 ± 8.7401 | 62.11 ± 9.9121 | 58.6328 |
| 4.0 | 59.49 ± 5.5365 | 57.9 ± 9.7185 | 61.07 ± 13.3576 | 59.2665 |
| 5.0 | 58.7 ± 3.9015 | 61.59 ± 7.0814 | 55.77 ± 6.7373 | 59.9094 |
| 6.0 | 58.7 ± 3.9015 | 61.06 ± 8.5573 | 56.33 ± 7.8225 | 59.7188 |
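The four metrics reported in these tables (accuracy, true-positive rate, true-negative rate, macro-F1) all follow from a binary confusion matrix. A short sketch of how they are computed; the confusion-matrix counts below are made up for illustration, not taken from the paper:

```python
def binary_metrics(tp, fn, tn, fp):
    """ACC, TPR, TNR, and macro-F1 from binary confusion-matrix counts."""
    acc = (tp + tn) / (tp + fn + tn + fp)
    tpr = tp / (tp + fn)            # sensitivity: recall of the positive class
    tnr = tn / (tn + fp)            # specificity: recall of the negative class
    # Macro-F1 averages the per-class F1 scores.
    prec_pos = tp / (tp + fp)
    f1_pos = 2 * prec_pos * tpr / (prec_pos + tpr)
    prec_neg = tn / (tn + fn)
    f1_neg = 2 * prec_neg * tnr / (prec_neg + tnr)
    return acc, tpr, tnr, (f1_pos + f1_neg) / 2

# Illustrative counts for one fold of a binary valence classification.
acc, tpr, tnr, mf1 = binary_metrics(tp=60, fn=40, tn=63, fp=37)
print(round(acc * 100, 2), round(tpr * 100, 1),
      round(tnr * 100, 1), round(mf1 * 100, 2))
```

The means and standard deviations in the tables would come from repeating this per fold (or per subject) and aggregating.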
The classification results of RT-ERM with different time scales for α rhythm under the dimension of emotion valence
| Time scale (s) | ACC/% (mean ± std.) | TPR/% (mean ± std.) | TNR/% (mean ± std.) | Macro-F1 |
|---|---|---|---|---|
| 0.25 | 60.27 ± 4.1427 | 61.57 ± 11.2808 | 58.95 ± 10.4674 | 61.0404 |
| 0.5 | 58.17 ± 3.1975 | 56.85 ± 6.5733 | 59.48 ± 8.5147 | 57.7809 |
| 0.75 | 59.76 ± 4.4014 | 59.47 ± 8.4918 | 60.01 ± 7.5027 | 59.6736 |
| 1.0 | 58.43 ± 3.8471 | 60.0 ± 10.3032 | 56.84 ± 11.7183 | 59.3884 |
| 2.0 | 60.53 ± 5.1299 | 59.47 ± 7.0977 | 61.59 ± 10.7938 | 60.3727 |
| 3.0 | 58.97 ± 3.7542 | 58.96 ± 9.3483 | 58.95 ± 9.3551 | 59.0652 |
| 4.0 | 58.17 ± 2.9786 | 61.05 ± 8.5364 | 55.27 ± 8.2255 | 59.4217 |
| 5.0 | 59.22 ± 8.087 | 63.16 ± 14.6947 | 55.26 ± 9.1902 | 60.5821 |
| 6.0 | 61.06 ± 3.4886 | 59.99 ± 10.3256 | 62.1 ± 9.6398 | 60.8203 |
The classification results of RT-ERM with different time scales for β rhythm under the dimension of emotion valence
| Time scale (s) | ACC/% (mean ± std.) | TPR/% (mean ± std.) | TNR/% (mean ± std.) | Macro-F1 |
|---|---|---|---|---|
| 0.25 | 60.29 ± 4.7628 | 58.94 ± 8.086 | 61.57 ± 5.7827 | 59.6931 |
| 0.5 | 57.91 ± 2.0398 | 54.75 ± 6.7467 | 61.06 ± 6.3283 | 56.5961 |
| 0.75 | 62.12 ± 5.7946 | 66.85 ± 9.1329 | 56.31 ± 11.7915 | 63.7077 |
| 1.0 | 59.47 ± 2.6751 | 58.42 ± 7.9748 | 60.52 ± 5.8853 | 59.0577 |
| 2.0 | 60.02 ± 3.4931 | 58.94 ± 9.3506 | 61.05 ± 5.8641 | 59.5255 |
| 3.0 | 58.18 ± 2.9866 | 56.32 ± 7.4668 | 60.0 ± 9.4722 | 57.6223 |
| 4.0 | 61.07 ± 6.5296 | 58.99 ± 13.0313 | 63.16 ± 9.7131 | 60.1829 |
| 5.0 | 59.48 ± 3.3671 | 51.62 ± 9.3297 | 67.38 ± 7.7268 | 56.0941 |
| 6.0 | 57.79 ± 2.4043 | 52.31 ± 9.2681 | 63.28 ± 7.6005 | 55.4205 |
The classification results of RT-ERM with different time scales for γ rhythm under the dimension of emotion valence
| Time scale (s) | ACC/% (mean ± std.) | TPR/% (mean ± std.) | TNR/% (mean ± std.) | Macro-F1 |
|---|---|---|---|---|
| 0.25 | 58.17 ± 3.1975 | 57.37 ± 7.9609 | 58.95 ± 8.0782 | 57.9163 |
| 0.5 | 58.18 ± 4.1395 | 61.58 ± 9.7172 | 54.75 ± 9.1824 | 59.5898 |
| 0.75 | 58.69 ± 2.359 | 56.3 ± 9.72049 | 61.06 ± 8.8617 | 57.7672 |
| 1.0 | 59.23 ± 3.1682 | 59.49 ± 5.7936 | 58.94 ± 7.3657 | 59.441 |
| 2.0 | 60.54 ± 4.2358 | 59.47 ± 12.0169 | 61.58 ± 11.7651 | 60.358 |
| 3.0 | 59.48 ± 2.9294 | 60.0 ± 7.8775 | 58.96 ± 9.0603 | 59.8546 |
| 4.0 | 59.48 ± 5.7782 | 60.02 ± 6.7483 | 58.95 ± 8.0782 | 59.7818 |
| 5.0 | 60.52 ± 4.7069 | 60.53 ± 4.8538 | 60.54 ± 11.1021 | 60.9008 |
| 6.0 | 58.96 ± 2.6829 | 58.41 ± 10.1091 | 59.49 ± 7.0932 | 58.7311 |
The classification results of RT-ERM with different time scales for θ rhythm under the dimension of emotion arousal
| Time scale (s) | ACC/% (mean ± std.) | TPR/% (mean ± std.) | TNR/% (mean ± std.) | Macro-F1 |
|---|---|---|---|---|
| 0.25 | 67.0 ± 7.3143 | 60.5 ± 10.8282 | 73.5 ± 13.4257 | 65.5009 |
| 0.5 | 69.1 ± 4.2131 | 65.5 ± 10.3561 | 72.5 ± 8.1394 | 67.9658 |
| 0.75 | 62.25 ± 2.8394 | 55.0 ± 5.0 | 69.5 ± 4.7169 | 59.3537 |
| 1.0 | 64.25 ± 3.7165 | 63.5 ± 8.6746 | 65.0 ± 10.9544 | 63.7584 |
| 2.0 | 64.57 ± 1.991 | 63.41 ± 6.3994 | 65.73 ± 7.8801 | 64.3596 |
| 3.0 | 61.0 ± 5.3851 | 62.5 ± 11.8848 | 59.5 ± 8.2006 | 61.5251 |
| 4.0 | 57.75 ± 4.9307 | 56.0 ± 9.9498 | 59.5 ± 10.1118 | 57.18 |
| 5.0 | 61.0 ± 2.7838 | 61.0 ± 9.1651 | 61.0 ± 7.0 | 61.0651 |
| 6.0 | 62.5 ± 4.8734 | 61.5 ± 8.6746 | 63.5 ± 6.7268 | 62.1434 |
The classification results of RT-ERM with different time scales for α rhythm under the dimension of emotion arousal
| Time scale (s) | ACC/% (mean ± std.) | TPR/% (mean ± std.) | TNR/% (mean ± std.) | Macro-F1 |
|---|---|---|---|---|
| 0.25 | 63.75 ± 3.4003 | 60.5 ± 6.8738 | 67.0 ± 8.42615 | 62.7233 |
| 0.5 | 58.25 ± 6.8965 | 59.0 ± 13.0 | 57.5 ± 12.2983 | 58.7429 |
| 0.75 | 60.25 ± 4.8023 | 59.0 ± 8.0 | 61.5 ± 8.6746 | 59.8748 |
| 1.0 | 58.24 ± 4.8916 | 56 ± 20.3469 | 55.5 ± 7.8898 | 54.2846 |
| 2.0 | 60.75 ± 2.5124 | 60.0 ± 8.3666 | 61.5 ± 7.433 | 60.552 |
| 3.0 | 56.25 ± 2.3048 | 56.5 ± 12.6589 | 56.0 ± 12.0 | 56.5154 |
| 4.0 | 59.5 ± 2.4494 | 61.0 ± 9.4339 | 58 ± 8.7178 | 60.2175 |
| 5.0 | 58.0 ± 4.4441 | 57.5 ± 8.13941 | 58.5 ± 8.6746 | 57.8832 |
| 6.0 | 60.25 ± 4.5345 | 60.0 ± 10.7238 | 60.5 ± 12.7377 | 60.5214 |
The classification results of RT-ERM with different time scales for β rhythm under the dimension of emotion arousal
| Time scale (s) | ACC/% (mean ± std.) | TPR/% (mean ± std.) | TNR/% (mean ± std.) | Macro-F1 |
|---|---|---|---|---|
| 0.25 | 58.75 ± 3.5794 | 61.0 ± 8.6023 | 56.5 ± 9.7596 | 59.832 |
| 0.5 | 60.75 ± 4.8798 | 68.0 ± 7.1414 | 53.5 ± 12.4599 | 63.733 |
| 0.75 | 63.5 ± 3.0 | 64.0 ± 8.6023 | 63.0 ± 8.124 | 63.8073 |
| 1.0 | 63.0 ± 4.8476 | 65.5 ± 10.5948 | 60.5 ± 8.7891 | 64.0 |
| 2.0 | 59.0 ± 3.3911 | 57.5 ± 10.0623 | 60.5 ± 11.9268 | 58.6477 |
| 3.0 | 56.25 ± 5.6181 | 56.5 ± 6.3442 | 56.0 ± 9.9498 | 56.5601 |
| 4.0 | 58.5 ± 5.0249 | 58.5 ± 8.9582 | 58.5 ± 10.7354 | 58.788 |
| 5.0 | 58.25 ± 3.3634 | 59.0 ± 5.3851 | 57.5 ± 4.6097 | 58.566 |
| 6.0 | 59.75 ± 3.4369 | 57.0 ± 7.1414 | 62.0 ± 9.0 | 58.6327 |
The classification results of RT-ERM with different time scales for γ rhythm under the dimension of emotion arousal
| Time scale (s) | ACC/% (mean ± std.) | TPR/% (mean ± std.) | TNR/% (mean ± std.) | Macro-F1 |
|---|---|---|---|---|
| 0.25 | 61.25 ± 3.9131 | 62.5 ± 5.1234 | 60.0 ± 7.7459 | 61.8742 |
| 0.5 | 60.0 ± 5.1234 | 59.5 ± 8.7891 | 60.5 ± 8.7891 | 59.8828 |
| 0.75 | 58.75 ± 2.7951 | 61.5 ± 8.6746 | 56.0 ± 10.9087 | 60.0983 |
| 1.0 | 59.5 ± 1.8708 | 64.0 ± 7.6811 | 55.0 ± 7.0711 | 61.3011 |
| 2.0 | 61.5 ± 5.0249 | 60.5 ± 9.6046 | 62.5 ± 8.1394 | 61.17 |
| 3.0 | 59.5 ± 1.5 | 64.0 ± 8.8881 | 56.0 ± 11.5758 | 61.8743 |
| 4.0 | 59.25 ± 5.25 | 57.5 ± 6.0207 | 61.0 ± 6.6332 | 58.5626 |
| 5.0 | 59.5 ± 4.4441 | 58.0 ± 10.2956 | 61.0 ± 7.0 | 58.8468 |
| 6.0 | 57.5 ± 2.958 | 52.5 ± 7.1589 | 62.5 ± 6.0207 | 55.2748 |
Comparison of results that use EEG signals of DEAP dataset for emotion recognition
| Literature | Emotion category | Window length | Classifier | Highest classification accuracy (Acc/%) |
|---|---|---|---|---|
| Rozgić et al. | Arousal/2 Valence/2 | 1 s/2 s/4 s/8 s (1-s step length) | SVM, KNN | 68.4/2 76.9/2 |
| Zhuang et al. | Arousal/2 Valence/2 | 1 s (0.1-s step length) | SVR | 68.4/2 76.9/2 |
| Yoon et al. | Arousal/2 Valence/2 | 2 s (1-s step length) | Bayesian based on sensor convergence | 70.1/2 70.9/2 |
| Hatamikia et al. | Arousal/2 Valence/2 | 1 s | KNN, QDA, LDA | 74.2/2 72.33/2 |
| Tripathi et al. | Arousal/2 Valence/2 | – | DNN | 73.28/2 75.58/2 |
| Li et al. | (Arousal and Valence)/2 | 3 s | SAE, LSTM RNN | 79.26/2 |
| Kuai et al. | Arousal/2 Valence/2 | 3 s | RSP-ERM | 64/2 66.6/2 |
| Our work | Arousal/2 Valence/2 | < 1 s | RT-ERM | 69.1/2 62.12/2 |
SVR support vector regression, QDA quadratic discriminant analysis, LDA linear discriminant analysis, RSP-ERM emotional recognition model based on rhythm synchronization patterns, /2 binary classification