Yanjing Shi, Xiangwei Zheng, Min Zhang, Xiaoyan Yan, Tiantian Li, Xiaomei Yu.
Abstract
Electroencephalogram (EEG) signals have been widely utilized in emotion recognition. Psychologists have found that emotions can be divided into conscious and unconscious emotions. In this article, we explore the classification of subliminal emotions (happiness and anger) from EEG signals elicited by subliminal face stimulation; that is, we select appropriate features for classifying subliminal emotions. First, the multi-scale sample entropy (MSpEn), wavelet packet energy (Ei), and wavelet packet entropy (WpEn) of the EEG signals are extracted. Then, these features are fed into a decision tree and an improved random forest, respectively. The classification accuracy with Ei and WpEn is higher than with MSpEn, which shows that Ei and WpEn can serve as effective features for classifying subliminal emotions. We compared the classification results of the different features combined with the decision tree algorithm and the improved random forest algorithm. The experimental results indicate that the improved random forest algorithm attains the best classification accuracy for subliminal emotions. Finally, subliminal emotions and physiological evidence for the subliminal affective priming effect are discussed.
Keywords: EEG; feature extraction; improved random forest; subliminal emotion; subliminal emotion classification
Year: 2022 PMID: 35401346 PMCID: PMC8989849 DOI: 10.3389/fpsyg.2022.781448
Source DB: PubMed Journal: Front Psychol ISSN: 1664-1078
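The MSpEn feature described in the abstract can be illustrated with a minimal pure-Python sketch: coarse-grain the signal at several scales, then compute the sample entropy of each coarse-grained series. The parameter defaults (m = 2, tolerance r = 0.2 standard deviations) are common conventions in the sample-entropy literature, not values taken from this paper:

```python
import math

def sample_entropy(x, m=2, r=0.2):
    """Sample entropy: negative log of the ratio of (m+1)-length to
    m-length template matches within tolerance r * std (Chebyshev)."""
    n = len(x)
    mean = sum(x) / n
    std = math.sqrt(sum((v - mean) ** 2 for v in x) / n)
    tol = r * std

    def count_matches(length):
        # number of template pairs (i < j) within Chebyshev distance tol
        templates = [x[i:i + length] for i in range(n - length + 1)]
        count = 0
        for i in range(len(templates)):
            for j in range(i + 1, len(templates)):
                if max(abs(a - b) for a, b in zip(templates[i], templates[j])) <= tol:
                    count += 1
        return count

    b = count_matches(m)
    a = count_matches(m + 1)
    return -math.log(a / b) if a > 0 and b > 0 else float("inf")

def coarse_grain(x, scale):
    # non-overlapping window averages: the multiscale step
    return [sum(x[i:i + scale]) / scale
            for i in range(0, len(x) - scale + 1, scale)]

def multiscale_sample_entropy(x, scales=(1, 2, 3), m=2, r=0.2):
    """MSpEn: sample entropy of the coarse-grained signal at each scale."""
    return [sample_entropy(coarse_grain(x, s), m, r) for s in scales]
```

Regular signals yield small entropies at all scales, while noisy signals stay high, which is what makes MSpEn a usable discriminative feature for EEG epochs.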
Figure 1. The process of subliminal emotion classification.
Figure 2. Energy ratio of 4-layer wavelet packet decomposition.
The 4-layer wavelet packet decomposition frequency intervals and energy ratio.

| Node | Energy ratio | Frequency interval (Hz) |
|---|---|---|
| (4,0) | 87.3802% | 0 ~ 16 |
| (4,1) | 5.9086% | 16 ~ 32 |
| (4,2) | 3.1553% | 32 ~ 48 |
| (4,3) | 2.004% | 48 ~ 64 |
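Energy ratios like those in the table above come from a full wavelet packet decomposition: every node is split at each level, and each leaf's energy is normalized by the total. The minimal sketch below uses a Haar filter bank and natural node ordering purely for illustration; the paper's wavelet basis is not stated in the abstract:

```python
import math

def haar_step(x):
    """One analysis level of an orthonormal Haar filter bank:
    returns (approximation, detail) coefficient lists."""
    if len(x) % 2:
        x = x + [x[-1]]  # pad odd-length input
    s = math.sqrt(2.0)
    a = [(x[2 * i] + x[2 * i + 1]) / s for i in range(len(x) // 2)]
    d = [(x[2 * i] - x[2 * i + 1]) / s for i in range(len(x) // 2)]
    return a, d

def wavelet_packet_leaves(x, levels=4):
    """Full wavelet packet tree: split every node (not only the
    approximation) at each level; returns the 2**levels leaf nodes."""
    nodes = [list(x)]
    for _ in range(levels):
        nodes = [part for node in nodes for part in haar_step(node)]
    return nodes

def energy_ratios(x, levels=4):
    """Per-node energy as a fraction of total energy (the Ei feature)."""
    energies = [sum(v * v for v in leaf)
                for leaf in wavelet_packet_leaves(x, levels)]
    total = sum(energies)
    return [e / total for e in energies]

def wavelet_packet_entropy(x, levels=4):
    """WpEn: Shannon entropy of the energy-ratio distribution."""
    return -sum(p * math.log(p) for p in energy_ratios(x, levels) if p > 0)
```

Because the Haar transform is orthonormal, the leaf energies sum to the signal energy, so the ratios always sum to 1 and WpEn is bounded by log of the leaf count.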
Figure 3. The principle of the improved random forest algorithm.
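The abstract does not specify how the random forest is improved; one common improvement is to weight each tree's vote by its held-out accuracy instead of using a plain majority vote. A minimal sketch of such weighted voting (the function name and weighting scheme are illustrative assumptions, not the authors' method):

```python
from collections import Counter

def weighted_vote(tree_predictions, tree_weights):
    """Combine per-tree class predictions using per-tree weights,
    e.g. each tree's validation accuracy; highest total weight wins."""
    scores = Counter()
    for label, weight in zip(tree_predictions, tree_weights):
        scores[label] += weight
    return scores.most_common(1)[0][0]
```

For example, `weighted_vote(["happy", "angry", "happy"], [0.9, 0.6, 0.7])` returns `"happy"` (total weight 1.6 vs. 0.6), even though an unweighted vote would give the same result here; the weights matter when strong and weak trees disagree.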
Average results of decision tree algorithm with multi-scale sample entropy (MSpEn), wavelet packet entropy (WpEn), and wavelet packet energy (E). Column headers and several cells did not survive extraction; the recoverable values are:

| Row | Recovered accuracies |
|---|---|
| MSpEn | 80.25%, 73.57%, 72.15%, 53.65%, 83.52%, 79.80% |
| (unlabeled) | 96.77% |
| WpEn | 96.97%, 83.27%, 95.30%, 91.40%, 88.68% |
| MSpEn | 79.80%, 75.80%, 78.60%, 84.50%, 73.40%, 75.52% |
| (unlabeled) | 89.07% |
| WpEn | 88.68%, 94.08%, 97.55%, 97.39%, 93.32% |
Figure 4. Comparison of classification accuracy with three features and decision tree classifier.
Average results of random forest algorithm with MSpEn, WpEn, and E. Column headers and several cells did not survive extraction; the recoverable values are:

| Row | Recovered accuracies |
|---|---|
| MSpEn | 87.65%, 85%, 86.25%, 85%, 91.25%, 96.25% |
| WpEn | 96.25%, 86.25%, 90%, 95%, 95% |
| MSpEn | 96.25%, 88.75%, 86.25%, 93.75%, 88.75%, 88.89% |
| (unlabeled) | 96.25% |
| WpEn | 95%, 91.25%, 93.75%, 96.25%, 93.38% |
Figure 5. Comparison of classification accuracy with three features and improved random forest.
Figure 6. Comparison of classification accuracy of two classifiers based on MSpEn.
Figure 7. Comparison of classification accuracy of two classifiers based on E.
Figure 8. Comparison of classification accuracy of two classifiers based on WpEn.
Figure 9. Comparison of the average classification results with other classifiers.