Li Liu, Yunfeng Ji, Yun Gao, Tao Li, Wei Xu.
Abstract
College students face mounting pressure from study, work, relationships, and daily life, and their emotional fluctuations are becoming increasingly pronounced. For student management staff, the ability to accurately grasp each student's emotional state throughout the academic process would be of great help to management work. Traditional ways of gauging students' emotions at a given stage rely mostly on chats, questionnaires, and similar methods. However, collecting data this way is time-consuming and labor-intensive, and the authenticity of the collected data cannot be guaranteed, because students may lie out of impatience or an unwillingness to reveal their true emotions. To explore an accurate and efficient emotion recognition method for college students, this work uses more objective physiological data. Because emotion is generated by the central nervous system of the brain, EEG signals directly reflect the brain's electrophysiological activity, and EEG is therefore favored in physiological-signal-based emotion recognition for its direct reflection of emotional state. Accordingly, a deep neural network (DNN) is used to classify the collected emotional EEG data and to infer students' emotional states from the classification results. Since different features capture different aspects of the raw data, multiple EEG features are first extracted to represent the original signal as comprehensively as possible. Second, these features are fused using the auto-sklearn model-ensembling technique. Third, the fused features are fed into the DNN to produce the final classification. Experimental results show that the method compares favorably on public datasets, with emotion recognition accuracy exceeding 88%.
This demonstrates that the proposed emotion recognition method is feasible for real-world application.
Year: 2022 PMID: 35665293 PMCID: PMC9162810 DOI: 10.1155/2022/1343358
Source DB: PubMed Journal: Comput Intell Neurosci
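The abstract's three-step pipeline (extract several feature views of the EEG, fuse them, then classify with a DNN) can be sketched at a high level. Everything below is an illustrative assumption: the array shapes, the crude feature helpers, and plain concatenation standing in for the paper's auto-sklearn model-ensembling fusion step, which is not detailed in this excerpt.

```python
import numpy as np

def extract_feature_sets(eeg):
    """Step 1 (sketch): compute several feature views of one EEG epoch.

    `eeg` is (channels, samples); each helper yields one row per channel.
    """
    # Crude time-domain view: mean, standard deviation, RMS per channel.
    time_feats = np.stack([[c.mean(), c.std(), np.sqrt((c ** 2).mean())]
                           for c in eeg])
    # Crude frequency-domain view: power in the first 10 FFT bins.
    spec = np.abs(np.fft.rfft(eeg, axis=1)) ** 2
    freq_feats = spec[:, :10]
    return [time_feats, freq_feats]

def fuse(feature_sets):
    """Step 2 (stand-in): flatten and concatenate the feature views.

    The paper fuses features with auto-sklearn's ensemble machinery;
    simple concatenation is used here only to keep the sketch runnable.
    """
    return np.concatenate([f.ravel() for f in feature_sets])

# One synthetic 32-channel, 1-second epoch at 128 Hz (hypothetical sizes).
rng = np.random.default_rng(0)
epoch = rng.normal(size=(32, 128))
fused = fuse(extract_feature_sets(epoch))  # Step 3 would feed this to the DNN
```

The fused vector here has 32 × 3 time-domain plus 32 × 10 frequency-domain entries per epoch; the paper's actual feature set (see the table below) is far richer.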
EEG features.
| Domain | Features |
|---|---|
| Time domain | Skewness, kurtosis, zero-crossing rate, instability index, Hurst exponent, detrended fluctuation analysis, Petrosian fractal dimension, sample entropy, HFD, Hjorth activity, Hjorth mobility, Hjorth complexity, energy, RMS, vector autoregression (VAR) |
| Frequency domain | Power, wavelet entropy, spectral entropy, PSD, partial directed coherence (PDC), band power (BP) |
| Spatial domain | Index of asymmetry, complex network (CN) |
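Several of the tabulated features follow from standard formulas. A minimal NumPy sketch of a few of them is shown below; the sampling rate, window length, and synthetic test signal are illustrative assumptions, and the paper's exact implementations are not given in this excerpt.

```python
import numpy as np

def hjorth_parameters(x):
    """Hjorth activity, mobility, and complexity of a 1-D signal."""
    dx = np.diff(x)
    ddx = np.diff(dx)
    activity = np.var(x)
    mobility = np.sqrt(np.var(dx) / np.var(x))
    complexity = np.sqrt(np.var(ddx) / np.var(dx)) / mobility
    return activity, mobility, complexity

def time_domain_features(x):
    """A few of the table's time-domain features for one EEG channel."""
    x = np.asarray(x, dtype=float)
    mu, sigma = x.mean(), x.std()
    z = (x - mu) / sigma
    skewness = np.mean(z ** 3)
    kurtosis = np.mean(z ** 4) - 3.0              # excess kurtosis
    zcr = np.mean(np.abs(np.diff(np.sign(x - mu))) > 0)  # zero-crossing rate
    rms = np.sqrt(np.mean(x ** 2))
    activity, mobility, complexity = hjorth_parameters(x)
    return {"skewness": skewness, "kurtosis": kurtosis, "zcr": zcr,
            "rms": rms, "activity": activity,
            "mobility": mobility, "complexity": complexity}

def band_power(x, fs, band):
    """Power in a frequency band from a simple one-sided periodogram."""
    freqs = np.fft.rfftfreq(len(x), 1.0 / fs)
    psd = np.abs(np.fft.rfft(x)) ** 2 / (fs * len(x))
    mask = (freqs >= band[0]) & (freqs < band[1])
    return psd[mask].sum() * (freqs[1] - freqs[0])

# Example: a synthetic 1-second, 128 Hz trace with a 10 Hz (alpha-band) tone.
t = np.arange(128) / 128.0
signal = np.sin(2 * np.pi * 10 * t)
feats = time_domain_features(signal)
alpha = band_power(signal, 128, (8, 13))   # dominates for this signal
beta = band_power(signal, 128, (13, 30))   # near zero for this signal
```

For a pure sine the numbers are easy to sanity-check: RMS is 1/sqrt(2), skewness is 0, and excess kurtosis is -1.5, which makes the helpers convenient to unit-test.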
Figure 1. Emotion recognition architecture diagram.
Figure 2. Feature sorting.
Description of experimental software and hardware environment.
| Name | Details | Name | Details |
|---|---|---|---|
| CPU | Intel i9-9900K | Editor | PyCharm Community 2018.3 |
| RAM | 32 GB DDR4-3200 | Language | Python 3.6 |
| GPU | GTX 1070 | Deep learning framework | TensorFlow 1.12 |
| Graphics card | NVIDIA GeForce RTX 2080 | Operating system | Windows 10 |
Experimental results of each model on different emotional dimensions.
| Model | Index | Valence | Arousal | Liking |
|---|---|---|---|---|
| SVM | Accuracy | 0.6587 | 0.5882 | 0.5643 |
| | Precision | 0.6174 | 0.5612 | 0.5144 |
| | — | 0.5693 | 0.5610 | 0.4757 |
| CNN | Accuracy | 0.7596 | 0.6716 | 0.6685 |
| | Precision | 0.7396 | 0.6586 | 0.6285 |
| | — | 0.7264 | 0.6659 | 0.6217 |
| DNN | Accuracy | 0.8773 | 0.8606 | 0.8457 |
| | Precision | 0.8338 | 0.8316 | 0.8182 |
| | — | 0.8163 | 0.8573 | 0.8383 |
| RNN | Accuracy | 0.7196 | 0.7016 | 0.6985 |
| | Precision | 0.7047 | 0.6644 | 0.6891 |
| | — | 0.7160 | 0.6254 | 0.6850 |
| LSTM | Accuracy | 0.8924 | 0.8841 | 0.8572 |
| | Precision | 0.8501 | 0.8744 | 0.8459 |
| | — | 0.8346 | 0.8282 | 0.8244 |
| Reference [ ] | Accuracy | 0.7288 | 0.7469 | 0.7475 |
| | Precision | 0.7069 | 0.7413 | 0.7346 |
| | — | 0.6767 | 0.7769 | 0.7235 |
| Reference [ ] | Accuracy | 0.7697 | 0.7830 | 0.7550 |
| | Precision | 0.7367 | 0.7571 | 0.7064 |
| | — | 0.7583 | 0.7983 | 0.7022 |
| Proposed | Accuracy | 0.9078 | 0.9002 | 0.8865 |
| | Precision | 0.8926 | 0.8841 | 0.8677 |
| | — | 0.8903 | 0.8782 | 0.8559 |
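The table's per-dimension accuracy and precision come from training classifiers on fused EEG features. The paper's exact DNN architecture is not specified in this excerpt, so the sketch below uses scikit-learn's `MLPClassifier` as a stand-in on synthetic, linearly separable "fused feature" vectors; the data sizes, layer widths, and labels are all illustrative assumptions, and the result does not reproduce the table's numbers.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Hypothetical fused feature vectors: 400 trials x 32 features, with a
# binary label (e.g. high vs. low valence) driven by the first 4 features.
X = rng.normal(size=(400, 32))
y = (X[:, :4].sum(axis=1) > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25,
                                          random_state=0)

# Stand-in DNN: standardize inputs, then a small two-hidden-layer MLP.
clf = make_pipeline(
    StandardScaler(),
    MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=500,
                  random_state=0),
)
clf.fit(X_tr, y_tr)
acc = clf.score(X_te, y_te)  # held-out accuracy, analogous to the table's index
```

In practice one such classifier would be trained per emotional dimension (Valence, Arousal, Liking), each with its own train/test split, which is how per-dimension scores like those above arise.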
Figure 3. Comparison of experimental results.
Figure 4. Loss and accuracy of training and test sets in the Valence dimension.
Figure 5. Loss and accuracy of training and test sets in the Arousal dimension.