Wei Wei, Qingxuan Jia, Yongli Feng, Gang Chen.
Abstract
Emotion recognition is an important pattern recognition problem that has inspired researchers in several areas. Various types of human data have been used for emotion recognition, including visual, audio, and physiological signals. This paper proposes a decision-level weight fusion strategy for emotion recognition from multichannel physiological signals. First, we selected four kinds of physiological signals: electroencephalogram (EEG), electrocardiogram (ECG), respiration amplitude (RA), and galvanic skin response (GSR), and extracted emotion features in several analysis domains. Second, we adopted a feedback strategy for weight definition, based on the recognition rate that an independent Support Vector Machine (SVM) classifier achieves for each emotion on each physiological signal. Finally, we introduced the weights at the decision level by linearly fusing the weight matrix with the classification results of the SVM classifiers. Experiments on the MAHNOB-HCI database show that the proposed strategy achieves the highest accuracy. The results also provide evidence and suggest a way to develop a more specialized emotion recognition system based on multichannel data using a weight fusion strategy.
Year: 2018 PMID: 30073024 PMCID: PMC6057426 DOI: 10.1155/2018/5296523
Source DB: PubMed Journal: Comput Intell Neurosci
Figure 1. EEG electrode locations.
Figure 2. Typical structure of the ECG signal.
Figure 3. Flow of the weight fusion strategy.
Figure 4. Flow of emotion recognition based on multichannel physiological signals.
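The decision-level weight fusion the abstract describes (per-signal SVM outputs combined linearly through a per-signal, per-emotion weight matrix) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the score vectors, the equal-weight baseline, and all numbers here are placeholders, and in the paper the scores would come from the four independently trained SVM classifiers.

```python
# Decision-level weight fusion sketch (illustrative; numbers are placeholders).
# Each physiological signal's classifier produces a score vector over the five
# emotions; a per-signal, per-emotion weight matrix scales those scores before
# they are summed, and the fused maximum gives the final decision.

EMOTIONS = ["Sadness", "Happiness", "Disgust", "Neutral", "Fear"]
SIGNALS = ["EEG", "ECG", "RA", "GSR"]

def fuse(scores, weights):
    """Linearly fuse per-signal score vectors with a weight matrix.

    scores:  {signal: [score per emotion]}  -- e.g. per-class SVM outputs
    weights: {signal: [weight per emotion]} -- from per-emotion recognition rates
    Returns the index of the winning emotion.
    """
    fused = [0.0] * len(EMOTIONS)
    for sig in SIGNALS:
        for e in range(len(EMOTIONS)):
            fused[e] += weights[sig][e] * scores[sig][e]
    return max(range(len(EMOTIONS)), key=lambda e: fused[e])

# Placeholder per-signal scores, purely for illustration:
scores = {
    "EEG": [0.1, 0.2, 0.1, 0.5, 0.1],
    "ECG": [0.2, 0.1, 0.2, 0.4, 0.1],
    "RA":  [0.3, 0.2, 0.1, 0.3, 0.1],
    "GSR": [0.1, 0.1, 0.2, 0.5, 0.1],
}
weights = {sig: [0.25] * len(EMOTIONS) for sig in SIGNALS}  # equal-weight baseline
print(EMOTIONS[fuse(scores, weights)])  # Neutral wins on these placeholder scores
```

The paper's feedback strategy replaces the equal-weight baseline with weights derived from each classifier's per-emotion recognition rate, so signals that recognize an emotion well count more toward that emotion's fused score.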
MAHNOB-HCI database recorded signals.
| Emotion Data Modalities |
|---|
| 32-channel Electroencephalogram (EEG) (256 Hz) |
| 3-channel Electrocardiogram (ECG) (256 Hz) |
| 1-channel Galvanic Skin Response (GSR) (256 Hz) |
| 1-channel Respiration Amplitude (RA) (256 Hz) |
| 1-channel Skin Temperature (SKT) (256 Hz) |
| Face and Body Video (6 cameras, 60 f/s) |
| Eye Gaze (60 Hz) |
| Audio (44.1 kHz) |
MAHNOB-HCI database recorded emotions and the corresponding labels.
| Emotion | Label |
|---|---|
| Neutral | 0 |
| Anxiety | 1 |
| Amusement | 2 |
| Sadness | 3 |
| Joy, happiness | 4 |
| Disgust | 5 |
| Anger | 6 |
| Surprise | 7 |
| Fear | 8 |
The size of each set of each emotion.
| Emotion | Total | Training set | Testing set |
|---|---|---|---|
| Sadness | 69 | 31 | 38 |
| Happiness | 86 | 31 | 55 |
| Disgust | 57 | 31 | 26 |
| Neutral | 112 | 31 | 81 |
| Fear | 39 | 31 | 8 |
The detailed number of correctly recognized data and recognition rate under various physiological signals.
| Physiological signal | Sadness | Happiness | Disgust | Neutral | Fear | Total |
|---|---|---|---|---|---|---|
| Testing set size | 38 | 55 | 26 | 81 | 8 | 208 |
| EEG (correct) | 25 | 42 | 13 | 70 | 5 | 155 |
| EEG (rate) | 65.79% | 76.36% | 50.00% | 86.42% | 62.50% | 74.52% |
| ECG (correct) | 21 | 37 | 18 | 63 | 4 | 143 |
| ECG (rate) | 55.26% | 67.27% | 69.23% | 77.78% | 50.00% | 68.75% |
| RA (correct) | 17 | 30 | 12 | 51 | 3 | 113 |
| RA (rate) | 44.74% | 54.55% | 46.15% | 62.96% | 37.50% | 54.33% |
| GSR (correct) | 18 | 29 | 13 | 56 | 4 | 120 |
| GSR (rate) | 47.37% | 52.73% | 50.00% | 69.14% | 50.00% | 57.69% |
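The per-emotion recognition rates above are the basis of the paper's feedback weight definition. One plausible reading of that definition can be sketched as follows; the per-emotion normalization (weights of the four signals summing to 1 for each emotion) is an assumption for illustration, since the record only states that weights follow the recognition rates.

```python
# Sketch of the feedback weight definition: turn each classifier's per-emotion
# recognition rates into a weight matrix, normalizing per emotion so the four
# signals' weights sum to 1. The normalization scheme is an assumption.

RATES = {  # recognition rate per emotion, taken from the table above
    #       Sadness  Happiness Disgust  Neutral  Fear
    "EEG": [0.6579,  0.7636,   0.5000,  0.8642,  0.6250],
    "ECG": [0.5526,  0.6727,   0.6923,  0.7778,  0.5000],
    "RA":  [0.4474,  0.5455,   0.4615,  0.6296,  0.3750],
    "GSR": [0.4737,  0.5273,   0.5000,  0.6914,  0.5000],
}

def weight_matrix(rates):
    """Normalize recognition rates per emotion into fusion weights."""
    signals = list(rates)
    n_emotions = len(next(iter(rates.values())))
    weights = {sig: [0.0] * n_emotions for sig in signals}
    for e in range(n_emotions):
        total = sum(rates[sig][e] for sig in signals)
        for sig in signals:
            weights[sig][e] = rates[sig][e] / total
    return weights

W = weight_matrix(RATES)
# Each emotion's column sums to 1. EEG receives the largest weight for every
# emotion except Disgust, where ECG's higher recognition rate dominates.
```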
Emotion expression ordering of each physiological signal.
| Physiological signal | Emotion expression ordering |
|---|---|
| EEG | Neutral > Happiness > Sadness > Fear > Disgust |
| ECG | Neutral > Disgust > Happiness > Sadness > Fear |
| RA | Neutral > Happiness > Disgust > Sadness > Fear |
| GSR | Neutral > Happiness > Disgust = Fear > Sadness |
The detailed number of correctly recognized data and recognition rate under two situations of weight matrix.
| Weight matrix | Sadness | Happiness | Disgust | Neutral | Fear | Total |
|---|---|---|---|---|---|---|
| Testing set size | 38 | 55 | 26 | 81 | 8 | 208 |
| Situation 1 (correct) | 25 | 43 | 18 | 72 | 5 | 155 |
| Situation 1 (rate) | 65.79% | 78.18% | 69.23% | 88.89% | 62.50% | 74.52% |
| Situation 2 (correct) | 28 | 47 | 20 | 75 | 6 | 176 |
| Situation 2 (rate) | 73.68% | 85.45% | 76.92% | 92.59% | 75.00% | 84.62% |
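The recognition rates in the weight-matrix comparison follow directly from the correctly recognized counts and the testing-set sizes, so they can be recomputed as a quick consistency check (counts taken from the tables above):

```python
# Recompute per-emotion recognition rates from correct counts and test sizes.
test_sizes = [38, 55, 26, 81, 8]  # Sadness, Happiness, Disgust, Neutral, Fear
correct    = [28, 47, 20, 75, 6]  # second weight-matrix situation

rates = [c / n for c, n in zip(correct, test_sizes)]
overall = sum(correct) / sum(test_sizes)  # 176 / 208, about 84.62% overall

print([f"{r:.2%}" for r in rates], f"{overall:.2%}")
```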