Gisela Pinto1, João M Carvalho1,2, Filipa Barros3,4,5, Sandra C Soares3,4,5, Armando J Pinho1,2, Susana Brás1,2.
Abstract
Emotional responses are associated with distinct body alterations and are crucial to foster adaptive responses, well-being, and survival. Emotion identification may improve people's emotion regulation strategies and interaction with multiple life contexts. Several studies have investigated emotion classification systems, but most of them are based on the analysis of only one, a few, or isolated physiological signals. Understanding how informative the individual signals are, and how their combination works, would allow the development of more cost-effective, informative, and objective systems for emotion detection, processing, and interpretation. In the present work, the electrocardiogram, electromyogram, and electrodermal activity were processed in order to find a physiological model of emotions. Both a unimodal and a multimodal approach were used to analyze which signal, or combination of signals, may better describe an emotional response, using a sample of 55 healthy subjects. The method was divided into: (1) signal preprocessing; (2) feature extraction; (3) classification using random forest and neural networks. Results suggest that the electrocardiogram (ECG) signal is the most effective for emotion classification. Yet, the combination of all signals provides the best emotion identification performance, with each signal contributing crucial information to the system. This physiological model of emotions has important research and clinical implications: it provides valuable information about the value and weight of physiological signals for emotion classification, which can critically inform evaluation, monitoring, and intervention regarding emotional processing and regulation across multiple contexts.
Keywords: affective computing; feature extraction; multimodal; neural network; random forest
Year: 2020 PMID: 32575894 PMCID: PMC7349550 DOI: 10.3390/s20123510
Source DB: PubMed Journal: Sensors (Basel) ISSN: 1424-8220 Impact factor: 3.576
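The three-step method summarized in the abstract (signal preprocessing, feature extraction, classification with random forest and neural networks) can be sketched as follows. This is a minimal illustration on synthetic feature vectors using scikit-learn stand-ins; the window sizes, feature dimensions, and model settings are assumptions, not the authors' exact configuration.

```python
# Sketch of the paper's three-step pipeline: preprocess -> features -> classify.
# Synthetic data and default-ish models stand in for the authors' actual setup.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import f1_score

rng = np.random.default_rng(0)

# Steps (1)-(2) stand-in: pretend each row is a window of features already
# extracted from ECG/EDA/EMG; here they are random numbers with a weak
# class-dependent shift injected so the classifiers have something to learn.
X = rng.normal(size=(300, 10))
y = rng.integers(0, 3, size=300)          # 0 = neutral, 1 = fear, 2 = happy
X[y == 2] += 0.8

X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)

# Step (3): the two model families compared in the paper.
rf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
nn = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500,
                   random_state=0).fit(X_tr, y_tr)

f1_rf = f1_score(y_te, rf.predict(X_te), average="macro")
f1_nn = f1_score(y_te, nn.predict(X_te), average="macro")
print(f"RF macro F1: {f1_rf:.2f}  NN macro F1: {f1_nn:.2f}")
```

In the paper, the same comparison is repeated per signal and per signal combination, under subject-independent, subject-dependent, and emotion-dependent evaluation schemes.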
Figure 1. Data collection schema.
Figure 2. Process for time-series synchronization.
Extracted features from the collected biosignals: electrocardiogram (ECG), electromyogram (EMG), and electrodermal activity (EDA).
| Signal | Features Extracted |
|---|---|
| ECG | R Peaks, Cardiac Cycles, T Waves, P Waves, Q Waves, Cardiac Cycles Signal Quality, Average Signal Quality, ECG Signal Quality, ECG Raw, ECG Filtered, Heart Rate, ECG Systole, RR Interval, Heart Rate Variability high frequency (ECG HRV HF), Heart Rate Variability low frequency (ECG HRV LF), Heart Rate Variability Ultra Low Frequency (ECG HRV ULF), Heart Rate Variability Very High Frequency (ECG HRV VHF), Detrended Fluctuation Analysis (DFA) 1, Detrended Fluctuation Analysis 2, Shannon, Sample Entropy, Heart Rate Variability Very Low Frequency (ECG HRV VLF), Electrocardiographic Artifacts, Root Mean Square of Successive Differences (RMSSD), mean of distance between Normal to Normal peaks (meanNN), standard deviation of distance between Normal to Normal peaks (sdNN), coefficient of variation of distance between Normal to Normal peaks (cvNN), coefficient of variation of sdNN (CVSD), median of distance between Normal to Normal peaks (medianNN), mean absolute deviation of distance between Normal to Normal peaks (madNN), Median-based Coefficient of Variation (mcvNN), percentage of NN intervals differing by more than 50 ms (pNN50), pNN20, Entropy Multiscale Area Under the Curve (AUC), Entropy SVD, Entropy Spectral VLF, Triang, Shannon h, Ultra Low Frequency (ULF), Very Low Frequency (VLF), Low Frequency (LF), High Frequency (HF), Very High Frequency (VHF), Correlation Dimension, Entropy Spectral LF, Entropy Spectral HF, Fisher Info, FD Petrosian, Total Power, LFn, HFn, LF/HF, LF/P, HF/P, FD Higuchi |
| EDA | EDA Raw, EDA Filtered, EDA Phasic, EDA Tonic, Skin Conductance Response (SCR) Recoveries, SCR Peaks, SCR Recovery Indexes, SCR Peaks Amplitudes, SCR Onsets, SCR Peaks Indexes |
| EMG | EMG Raw, EMG Filtered, EMG Envelope, EMG Activation, EMG Pulse Onsets |
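Several of the time-domain HRV features listed above (meanNN, sdNN, RMSSD, pNN50, pNN20) are simple statistics of the RR-interval series. A minimal sketch, using a short synthetic RR series in milliseconds (real pipelines derive the RR intervals from the detected R peaks in the ECG):

```python
# Time-domain HRV features from a synthetic RR-interval series (ms).
import numpy as np

rr = np.array([810, 790, 805, 820, 800, 780, 815, 795], dtype=float)  # ms
diff = np.diff(rr)  # successive differences between adjacent intervals

mean_nn = rr.mean()                          # meanNN: mean NN interval
sd_nn = rr.std(ddof=1)                       # sdNN: sample std of NN intervals
rmssd = np.sqrt(np.mean(diff ** 2))          # RMSSD: root mean square of diffs
pnn50 = 100.0 * np.mean(np.abs(diff) > 50)   # pNN50: % of diffs > 50 ms
pnn20 = 100.0 * np.mean(np.abs(diff) > 20)   # pNN20: % of diffs > 20 ms

print(mean_nn, sd_nn, rmssd, pnn50, pnn20)
```

The frequency-domain features (ULF, VLF, LF, HF, VHF, LF/HF) are instead obtained from a power spectral density of the interpolated RR series, and the entropy and fractal-dimension features from nonlinear analysis of the same series.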
Selected features for emotional classification, from electrocardiogram (ECG), electrodermal activity (EDA), and electromyogram (EMG).
| Signal | Selected Feature |
|---|---|
| ECG | Heart Rate, ECG RR Intervals, Heart Rate Variability High Frequency, Heart Rate Variability Low Frequency, Heart Rate Variability Ultra Low Frequency, T Waves |
| EDA | EDA Tonic, Skin Conductance Response (SCR) Peaks Indexes |
| EMG | EMG Envelope, EMG Pulse Onsets |
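One common way to arrive at a reduced feature subset like the one above is to rank features by random-forest importance and keep the top-ranked ones. The sketch below illustrates that idea on synthetic data with hypothetical feature names; it is not necessarily the selection procedure the authors used.

```python
# Illustrative feature ranking by random-forest importance (synthetic data).
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(1)
names = ["heart_rate", "rr_interval", "hrv_hf", "hrv_lf", "eda_tonic",
         "scr_peaks", "emg_envelope", "noise_1", "noise_2", "noise_3"]
X = rng.normal(size=(400, len(names)))
y = (X[:, 0] + X[:, 4] > 0).astype(int)   # only two features carry signal

rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
ranked = sorted(zip(rf.feature_importances_, names), reverse=True)
top2 = [name for _, name in ranked[:2]]
print(top2)  # the two informative features should rank first
```

With this construction, the two features that actually determine the label dominate the importance ranking, mirroring how an importance-based filter would discard uninformative channels.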
Figure 3. Representation of the T wave time series (right) and the corresponding box plot (left).
Figure 4. Multimodal emotion classification workflow.
F1-score (%) obtained for the two classifiers; (a) corresponds to the subject-independent evaluation, (b) to the subject-dependent evaluation, and (c) to the emotion-dependent evaluation.
| Signal | Condition | RF Neutral 30s | RF Neutral 60s | RF Fear 30s | RF Fear 60s | RF Happy 30s | RF Happy 60s | NN Neutral 30s | NN Neutral 60s | NN Fear 30s | NN Fear 60s | NN Happy 30s | NN Happy 60s |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| | (a) | 39 | 42 | 53 | 53 | 49 | 54 | 50 | 48 | 43 | 59 | 39 | 48 |
| | (b) | 50 | 24 | 50 | 40 | 52 | 45 | 32 | 54 | 52 | 49 | 52 | 33 |
| | (c) | 89 | 80 | 89 | 85 | 86 | 82 | 78 | 58 | 82 | 57 | 73 | 47 |
| | (a) | 46 | 44 | 27 | 38 | 44 | 48 | 49 | 44 | 38 | 45 | 50 | 49 |
| | (b) | 53 | 38 | 42 | 36 | 27 | 51 | 43 | 60 | 30 | 29 | 61 | 50 |
| | (c) | 71 | 62 | 57 | 61 | 63 | 60 | 34 | 61 | 56 | 61 | 68 | 44 |
| | (a) | 39 | 41 | 32 | 37 | 41 | 45 | 44 | 41 | 46 | 50 | 17 | 23 |
| | (b) | 47 | 51 | 49 | 47 | 48 | 51 | 58 | 54 | 25 | 22 | 29 | 26 |
| | (c) | 48 | 51 | 48 | 47 | 56 | 55 | 58 | 19 | 56 | 57 | 1 | 29 |
| | (a) | 34 | 35 | 40 | 36 | 35 | 39 | 41 | 44 | 38 | 32 | 44 | 41 |
| | (b) | 44 | 30 | 39 | 46 | 41 | 42 | 61 | 68 | 40 | 17 | 66 | 65 |
| | (c) | 39 | 37 | 46 | 49 | 44 | 47 | 35 | 56 | 55 | 41 | 80 | 50 |
| | (a) | 35 | 36 | 42 | 38 | 46 | 43 | 49 | 48 | 43 | 32 | 28 | 38 |
| | (b) | 44 | 43 | 46 | 48 | 45 | 49 | 57 | 56 | 9 | 51 | 44 | 34 |
| | (c) | 51 | 57 | 63 | 52 | 71 | 61 | 61 | 55 | 26 | 47 | 74 | 46 |
| | (a) | 42 | 38 | 42 | 49 | 46 | 35 | 29 | 40 | 51 | 62 | 47 | 33 |
| | (b) | 46 | 34 | 48 | 47 | 47 | 44 | 47 | 57 | 67 | 64 | 83 | 77 |
| | (c) | 61 | 52 | 61 | 56 | 52 | 54 | 53 | 48 | 73 | 63 | 60 | 63 |
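The per-emotion entries in a table like this are one-vs-rest F1-scores for each class (neutral, fear, happy). A minimal sketch of how such a value is computed from true and predicted labels, on toy data rather than the paper's:

```python
# Per-class (one-vs-rest) F1-score from label arrays.
import numpy as np

def f1_per_class(y_true, y_pred, cls):
    """F1 for one class: harmonic mean of its precision and recall."""
    tp = np.sum((y_pred == cls) & (y_true == cls))  # true positives
    fp = np.sum((y_pred == cls) & (y_true != cls))  # false positives
    fn = np.sum((y_pred != cls) & (y_true == cls))  # false negatives
    if 2 * tp + fp + fn == 0:
        return 0.0
    return 2 * tp / (2 * tp + fp + fn)

y_true = np.array([0, 0, 1, 1, 2, 2, 2, 0])   # 0=neutral, 1=fear, 2=happy
y_pred = np.array([0, 1, 1, 1, 2, 0, 2, 0])

for cls, name in enumerate(["neutral", "fear", "happy"]):
    print(name, round(100 * f1_per_class(y_true, y_pred, cls), 1))
```

Averaging these per-class values gives the macro F1 often reported as a single summary number.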
Figure 5. Sensitivity and specificity obtained for the two classifiers in the different conditions; (a) corresponds to the subject-independent evaluation, (b) to the subject-dependent evaluation, and (c) to the emotion-dependent evaluation.
Global results for the 30s frame. Mean (standard deviation) [Maximum, Minimum] F1-score obtained for the 30s classifiers in each iteration; (a) corresponds to the subject-independent evaluation, (b) to the subject-dependent evaluation, and (c) to the emotion-dependent evaluation.
| Signal | Condition | Random Forest (Mean % ± std %) | Neural Network (Mean % ± std %) |
|---|---|---|---|
| | (a) | 39.07 (1.61) [Max: 43.64%, Min: 37.94%] | 35.82 (1.05) [Max: 38.32%, Min: 34.37%] |
| | (b) | 30.93 (1.40) [Max: 35.47%, Min: 29.94%] | 31.00 (3.02) [Max: 31.74%, Min: 29.53%] |
| | (c) | 76.25 (1.42) [Max: 77.70%, Min: 73.13%] | 50.16 (3.41) [Max: 52.11%, Min: 40.14%] |
| | (a) | 36.00 (0.78) [Max: 36.73%, Min: 34.25%] | 37.00 (0.77) [Max: 38.87%, Min: 36.10%] |
| | (b) | 32.01 (0.83) [Max: 33.21%, Min: 30.31%] | 34.63 (1.16) [Max: 35.18%, Min: 30.11%] |
| | (c) | 49.79 (3.35) [Max: 58.10%, Min: 46.62%] | 37.43 (0.86) [Max: 38.24%, Min: 35.47%] |
| | (a) | 34.23 (0.20) [Max: 34.53%, Min: 33.82%] | 33.75 (1.05) [Max: 38.30%, Min: 34.87%] |
| | (b) | 31.60 (0.96) [Max: 32.74%, Min: 30.08%] | 31.12 (0.88) [Max: 34.01%, Min: 30.97%] |
| | (c) | 38.33 (1.66) [Max: 39.80%, Min: 34.95%] | 36.69 (1.40) [Max: 39.70%, Min: 34.69%] |
| | (a) | 33.82 (0.29) [Max: 34.33%, Min: 33.32%] | 33.00 (0.27) [Max: 34.41%, Min: 33.46%] |
| | (b) | 32.73 (0.43) [Max: 33.37%, Min: 31.68%] | 35.90 (2.88) [Max: 39.97%, Min: 29.48%] |
| | (c) | 39.46 (1.16) [Max: 41.65%, Min: 38.17%] | 41.18 (5.23) [Max: 51.98%, Min: 37.05%] |
| | (a) | 35.80 (0.43) [Max: 36.33%, Min: 35.03%] | 36.00 (0.63) [Max: 37.04%, Min: 35.20%] |
| | (b) | 35.28 (1.08) [Max: 36.73%, Min: 33.53%] | 27.67 (1.49) [Max: 29.49%, Min: 24.85%] |
| | (c) | 49.01 (1.62) [Max: 51.26%, Min: 45.90%] | 36.02 (2.08) [Max: 37.85%, Min: 30.04%] |
| | (a) | 37.53 (1.32) [Max: 38.78%, Min: 34.25%] | 38.98 (0.64) [Max: 39.90%, Min: 37.82%] |
| | (b) | 36.23 (1.59) [Max: 40.12%, Min: 34.93%] | 42.32 (4.46) [Max: 53.90%, Min: 39.03%] |
| | (c) | 48.79 (2.13) [Max: 51.72%, Min: 43.11%] | 42.77 (3.40) [Max: 46.75%, Min: 33.82%] |
Global results for the 60s frame. Mean (standard deviation) [Maximum, Minimum] F1-score obtained for the 60s classifiers in each iteration; (a) corresponds to the subject-independent evaluation, (b) to the subject-dependent evaluation, and (c) to the emotion-dependent evaluation.
| Signal | Condition | Random Forest (Mean % ± std %) | Neural Network (Mean % ± std %) |
|---|---|---|---|
| | (a) | 41.71 (1.26) [Max: 44.57%, Min: 39.95%] | 43.69 (0.95) [Max: 45.41%, Min: 42.65%] |
| | (b) | 26.80 (1.68) [Max: 30.21%, Min: 24.28%] | 28.69 (1.83) [Max: 31.28%, Min: 23.68%] |
| | (c) | 73.37 (3.72) [Max: 80.55%, Min: 67.55%] | 38.68 (1.97) [Max: 43.64%, Min: 36.25%] |
| | (a) | 38.68 (0.80) [Max: 40.42%, Min: 37.71%] | 38.95 (0.88) [Max: 41.38%, Min: 38.32%] |
| | (b) | 28.88 (1.76) [Max: 31.67%, Min: 25.63%] | 32.58 (2.36) [Max: 34.28%, Min: 26.32%] |
| | (c) | 52.15 (2.16) [Max: 55.62%, Min: 48.38%] | 40.80 (2.04) [Max: 43.77%, Min: 38.15%] |
| | (a) | 35.19 (0.40) [Max: 35.68%, Min: 34.40%] | 34.97 (0.41) [Max: 35.58%, Min: 34.11%] |
| | (b) | 36.06 (0.65) [Max: 37.88%, Min: 35.83%] | 34.93 (0.80) [Max: 35.61%, Min: 32.77%] |
| | (c) | 41.32 (0.77) [Max: 42.55%, Min: 39.92%] | 35.21 (0.55) [Max: 36.71%, Min: 34.69%] |
| | (a) | 33.13 (0.31) [Max: 33.69%, Min: 32.70%] | 33.88 (0.39) [Max: 34.42%, Min: 33.10%] |
| | (b) | 32.52 (1.32) [Max: 35.85%, Min: 30.58%] | 34.78 (1.97) [Max: 36.93%, Min: 29.62%] |
| | (c) | 36.23 (1.62) [Max: 37.50%, Min: 31.67%] | 34.25 (2.19) [Max: 36.40%, Min: 30.05%] |
| | (a) | 36.15 (0.59) [Max: 37.65%, Min: 35.61%] | 34.50 (0.59) [Max: 35.37%, Min: 33.37%] |
| | (b) | 32.13 (3.02) [Max: 35.55%, Min: 25.85%] | 25.32 (3.46) [Max: 29.40%, Min: 19.53%] |
| | (c) | 48.24 (1.88) [Max: 52.05%, Min: 46.13%] | 37.32 (2.09) [Max: 40.76%, Min: 34.10%] |
| | (a) | 37.02 (0.66) [Max: 37.89%, Min: 36.09%] | 37.93 (0.98) [Max: 39.52%, Min: 36.79%] |
| | (b) | 37.08 (0.52) [Max: 38.03%, Min: 36.35%] | 40.73 (1.75) [Max: 44.60%, Min: 38.37%] |
| | (c) | 41.66 (1.56) [Max: 44.55%, Min: 38.67%] | 37.61 (1.65) [Max: 41.48%, Min: 35.36%] |