| Literature DB >> 27672492 |
Shun Li1, Liqing Cui1, Changye Zhu2, Baobin Li2, Nan Zhao3, Tingshao Zhu3.
Abstract
Automatic emotion recognition is of great value in many applications; however, to fully realize its application value, more portable, non-intrusive, and inexpensive technologies need to be developed. Human gait can reflect the walker's emotional state and can serve as an information source for emotion recognition. This paper proposed a novel method to recognize emotional states from human gait using the Microsoft Kinect, a low-cost, portable, camera-based sensor. The gaits of fifty-nine participants under a neutral state, induced anger, and induced happiness were recorded by two Kinect cameras, and the original data were processed through joint selection, coordinate-system transformation, sliding-window Gaussian filtering, differential operation, and data segmentation. Features of gait patterns were extracted from the 3-dimensional coordinates of 14 main body joints by Fourier transformation and Principal Component Analysis (PCA). The classifiers NaiveBayes, RandomForests, LibSVM, and SMO (Sequential Minimal Optimization) were trained and evaluated, and the accuracies of recognizing anger and happiness from the neutral state reached 80.5% and 75.4%, respectively. Although the results of distinguishing the angry and happy states were not ideal in the current study, the work showed the feasibility of automatically recognizing emotional states from gait, with characteristics that meet the application requirements.
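The preprocessing and feature-extraction pipeline described in the abstract (sliding-window Gaussian filtering, differential operation, Fourier transformation, and PCA over 14 joints × 3 coordinates) can be sketched roughly as below. This is a minimal illustrative reconstruction, not the authors' implementation; the window size, number of retained frequencies, and component count are assumptions.

```python
import numpy as np

def preprocess(track, window=5, sigma=1.0):
    """Sliding-window Gaussian smoothing followed by first differencing
    of one joint-coordinate time series (1-D array of frame values)."""
    half = window // 2
    kernel = np.exp(-0.5 * ((np.arange(window) - half) / sigma) ** 2)
    kernel /= kernel.sum()
    padded = np.pad(track, half, mode="edge")
    smoothed = np.convolve(padded, kernel, mode="valid")
    return np.diff(smoothed)  # differencing suppresses slow positional drift

def fft_features(segment, n_freq=10):
    """Concatenate low-frequency FFT magnitudes over all joints/axes.
    segment: array of shape (frames, 14 joints, 3 coordinates)."""
    feats = []
    for j in range(segment.shape[1]):
        for c in range(segment.shape[2]):
            series = preprocess(segment[:, j, c])
            feats.append(np.abs(np.fft.rfft(series))[:n_freq])
    return np.concatenate(feats)  # 14 * 3 * n_freq values per gait segment

def pca_reduce(X, n_components=20):
    """Project the feature matrix X (samples x dims) onto its top
    principal components via SVD of the mean-centered data."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T
```

The reduced feature matrix would then be fed to the classifiers named in the abstract (NaiveBayes, RandomForests, LibSVM, SMO) for training and evaluation.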
Keywords: Affective computing; Emotion recognition; Gait; Kinect; Machine learning
Year: 2016 PMID: 27672492 PMCID: PMC5028730 DOI: 10.7717/peerj.2364
Source DB: PubMed Journal: PeerJ ISSN: 2167-8359 Impact factor: 2.984
Figure 1The experiment scene.
Figure 2The schematic of the experiment environment.
Figure 3The procedures of the first round experiment.
Figure 4Stick figure and location of body joint centers recorded by Kinect.
Self-reported emotional states before and after emotion priming.
| | Before priming (BP) | After priming I (API: before walking) | After priming II (APII: after walking) |
|---|---|---|---|
| Round 1: anger priming | 1.44(.93) | 6.46(1.99) | 5.08(1.98) |
| Round 2: happiness priming | 3.88(2.49) | 6.61(2.22) | 5.63(2.24) |
Notes.
The averages of participants' self-ratings are shown, with standard deviations in parentheses.
The accuracy of distinguishing the angry and neutral states.
| | NaiveBayes | RandomForests | LibSVM | SMO |
|---|---|---|---|---|
| KINECT1 | 80.5085 | 52.5424 | 72.0339 | 52.5424 |
| KINECT2 | 75.4237 | − | 71.1864 | − |
Notes.
Table entries are accuracies expressed as a percentage. Values below chance level (50%) are not presented.
The accuracy of distinguishing the happy and neutral states.
| | NaiveBayes | RandomForests | LibSVM | SMO |
|---|---|---|---|---|
| KINECT1 | 79.6610 | 51.6949 | 77.9661 | − |
| KINECT2 | 61.8644 | 51.6949 | 52.5414 | − |
Notes.
Table entries are accuracies expressed as a percentage. Values below chance level (50%) are not presented.
The accuracy of distinguishing the angry and happy states.
| | NaiveBayes | RandomForests | LibSVM | SMO |
|---|---|---|---|---|
| KINECT1 | 52.5424 | 55.0847 | − | 51.6949 |
| KINECT2 | − | 51.6949 | − | 50.8475 |
Notes.
Table entries are accuracies expressed as a percentage. Values below chance level (50%) are not presented.