Jing Xue
Abstract
To improve the classification accuracy and reliability of emotional state assessment, and to support music therapy, this paper proposes an EEG analysis method based on the wavelet transform under music-perception stimulation. Using data from the multichannel standard emotion database (DEAP), the α, β, and θ rhythms are extracted from the frontal (F3, F4), temporal (T7, T8), and central (C3, C4) channels with the wavelet transform. Empirical mode decomposition (EMD) is then applied to each extracted rhythm to obtain its intrinsic mode function (IMF) components, from which average-energy and amplitude-difference eigenvalues are computed: each rhythm wave yields three average-energy features and two amplitude-difference features, so that the EEG feature information is fully extracted. Finally, emotional state evaluation is performed with a support vector machine (SVM) classifier. The results show that classification accuracy among no emotion, positive emotion, and negative emotion exceeds 90%. In the pairwise classification problems among the four selected emotions, the proposed feature extraction method achieves higher accuracy than general feature extraction methods, reaching about 70%. Changes in EEG α-wave power were closely correlated with the polarity and intensity of emotion; α-wave power differed significantly between "happiness and fear," "pleasure and fear," and "fear and sadness." The method therefore has good application prospects in both psychological and physiological research on emotional perception and in practical applications.
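The rhythm-extraction step described in the abstract can be sketched with a discrete wavelet decomposition. This is a minimal illustration, not the paper's exact implementation: the `db4` mother wavelet, the 4-level decomposition depth, and the PyWavelets library are all assumptions; only the 128 Hz sampling rate (DEAP's preprocessed EEG) and the α/β/θ target bands come from the source context.

```python
import numpy as np
import pywt  # PyWavelets (assumed; the paper does not name a library)

FS = 128  # DEAP's preprocessed EEG is downsampled to 128 Hz

def extract_rhythm(signal, band):
    """Reconstruct one EEG rhythm from a 4-level db4 wavelet decomposition.

    At fs = 128 Hz the detail bands are roughly
    D1: 32-64 Hz, D2: 16-32 Hz (~beta), D3: 8-16 Hz (~alpha), D4: 4-8 Hz (theta).
    """
    level_for_band = {"beta": 2, "alpha": 3, "theta": 4}
    coeffs = pywt.wavedec(signal, "db4", level=4)
    # coeffs = [cA4, cD4, cD3, cD2, cD1]; cD_k sits at index len(coeffs) - k
    keep = len(coeffs) - level_for_band[band]
    filtered = [c if i == keep else np.zeros_like(c) for i, c in enumerate(coeffs)]
    return pywt.waverec(filtered, "db4")[: len(signal)]
```

Zeroing every coefficient array except one detail level and inverting the transform yields a signal containing only that level's frequency content, which is the usual way wavelet decomposition is used for EEG rhythm separation.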
Year: 2021 PMID: 34956582 PMCID: PMC8694970 DOI: 10.1155/2021/9725762
Source DB: PubMed Journal: J Healthc Eng ISSN: 2040-2295 Impact factor: 2.682
Figure 1: Algorithm flow chart.
Figure 2: Three rhythmic waves of the reconstructed EEG.
Figure 3: IMF component spectrogram with determined order.
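The abstract describes five features per rhythm: three average energies and two amplitude differences from the IMF components. The paper's exact definitions are not reproduced in this record, so the sketch below assumes a common reading: average energy is the mean squared amplitude of each of the first three IMFs, and the amplitude differences are taken between the mean absolute amplitudes of adjacent IMFs. The IMFs themselves are assumed to come from an EMD implementation (e.g. PyEMD's `EMD().emd(signal)`).

```python
import numpy as np

def imf_features(imfs):
    """Five features per rhythm: average energies of the first three IMFs
    and two amplitude differences between adjacent IMFs.

    `imfs` is an (n_imf, n_samples) array; at least three IMFs are assumed.
    """
    imfs = np.asarray(imfs, dtype=float)[:3]
    avg_energy = np.mean(imfs ** 2, axis=1)   # three average-energy features
    mean_amp = np.mean(np.abs(imfs), axis=1)
    amp_diff = mean_amp[:-1] - mean_amp[1:]   # two amplitude-difference features
    return np.concatenate([avg_energy, amp_diff])  # 5-dimensional feature vector
```

Applied to the three rhythms (α, β, θ) of six channels, this yields a fixed-length feature vector per trial, matching the abstract's count of three energy and two amplitude features per rhythm wave.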
Classification of no emotion and positive emotion.
| Band | Optimal accuracy rate (%) | Accuracy of test set (%) |
|---|---|---|
|  | 93.75 | 84.38 |
|  | 96.88 | 81.25 |
|  | 93.75 | 90.63 |
|  | 100.00 | 100.00 |
|  | 100.00 | 87.50 |
|  | 100.00 | 84.38 |
Classification of no emotion and negative emotion.
| Band | Optimal accuracy rate (%) | Accuracy of test set (%) |
|---|---|---|
|  | 96.75 | 93.75 |
|  | 93.88 | 87.50 |
|  | 100.00 | 93.75 |
|  | 100.00 | 100.00 |
|  | 100.00 | 96.88 |
|  | 100.00 | 100.00 |
Pairwise classification between four emotions.
| Emotion pair | Approximate entropy + wavelet entropy accuracy (%) | Average energy + amplitude difference (this algorithm) accuracy (%) |
|---|---|---|
| H-E | 65.63 | 68.75 |
| H-S | 68.75 | 81.25 |
| E-S | 50.00 | 75.00 |
| H-T | 59.38 | 71.88 |
| E-T | 59.38 | 78.13 |
| S-T | 53.13 | 65.63 |
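The final classification step reported in the tables above can be sketched as an SVM trained on per-trial feature vectors. This is an illustrative reconstruction under stated assumptions: scikit-learn, an RBF kernel, feature standardization, and a 75/25 train/test split are all choices of this sketch, not details confirmed by the record.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def evaluate_emotion_svm(features, labels, seed=0):
    """Train an SVM on EEG feature vectors and return held-out test accuracy."""
    X_train, X_test, y_train, y_test = train_test_split(
        features, labels, test_size=0.25, random_state=seed, stratify=labels)
    scaler = StandardScaler().fit(X_train)  # scale features to zero mean, unit variance
    clf = SVC(kernel="rbf", C=1.0).fit(scaler.transform(X_train), y_train)
    return clf.score(scaler.transform(X_test), y_test)

# toy usage: two well-separated synthetic "emotion" clusters of 5-D features
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 0.1, (40, 5)), rng.normal(3, 0.1, (40, 5))])
y = np.array([0] * 40 + [1] * 40)
acc = evaluate_emotion_svm(X, y)
```

In the paper's setting, `features` would hold the per-rhythm average-energy and amplitude-difference values per trial, and `labels` the emotion categories; a pairwise classifier like this is trained for each emotion pair in the table above.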