Context-Aware Emotion Recognition in the Wild Using Spatio-Temporal and Temporal-Pyramid Models.

Nhu-Tai Do, Soo-Hyung Kim, Hyung-Jeong Yang, Guee-Sang Lee, Soonja Yeom

Abstract

Emotion recognition plays an important role in human-computer interaction. Recent studies have focused on video emotion recognition in the wild and have run into difficulties related to occlusion, illumination, complex behavior over time, and auditory cues. State-of-the-art methods use multiple modalities, such as frame-level, spatiotemporal, and audio approaches. However, such methods have difficulty exploiting long-term dependencies in temporal information, capturing contextual information, and integrating multi-modal information. In this paper, we introduce a flexible multi-modal system for video-based emotion recognition in the wild. Our system tracks and votes on the significant faces corresponding to persons of interest in a video to classify the seven basic emotions. The key contribution of this study is the use of face feature extraction with context-aware and statistical information for emotion recognition. We also build two model architectures to effectively exploit long-term dependencies in temporal information: a temporal-pyramid model and a spatiotemporal model with a "Conv2D+LSTM+3DCNN+Classify" architecture. Finally, we propose a best-selection ensemble to improve the accuracy of multi-modal fusion; it selects the combination of spatiotemporal and temporal-pyramid models that achieves the best accuracy on the seven basic emotions. In our experiments, we benchmark the system on the AFEW dataset and achieve high accuracy.
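The "best selection ensemble" described in the abstract can be read as an exhaustive search over model combinations, keeping the subset whose averaged predictions score highest on validation data. Below is a minimal, hedged sketch of that idea; the function name, the two model names, and the stand-in probability arrays are illustrative assumptions, not the authors' implementation.

```python
from itertools import combinations

import numpy as np


def best_selection_ensemble(probs_by_model, y_true):
    """Pick the subset of models whose averaged class probabilities
    give the highest accuracy on the validation labels y_true.

    probs_by_model: dict mapping model name -> (n_samples, n_classes)
    array of predicted class probabilities.
    """
    names = list(probs_by_model)
    best_subset, best_acc = None, -1.0
    # Try every non-empty subset of the model pool.
    for k in range(1, len(names) + 1):
        for subset in combinations(names, k):
            # Average the probability outputs of the chosen models.
            avg = np.mean([probs_by_model[n] for n in subset], axis=0)
            acc = float(np.mean(avg.argmax(axis=1) == y_true))
            if acc > best_acc:
                best_subset, best_acc = subset, acc
    return best_subset, best_acc


# Illustrative toy data: seven samples, one per basic emotion class.
rng = np.random.default_rng(0)
y_val = np.arange(7)
probs = {
    "spatiotemporal": np.eye(7),          # stand-in: perfect predictions
    "temporal_pyramid": rng.random((7, 7)),  # stand-in: noisy predictions
}
subset, acc = best_selection_ensemble(probs, y_val)
```

Because the pool contains only a handful of spatiotemporal and temporal-pyramid variants, the 2^n - 1 subsets are cheap to enumerate, so an exhaustive search is practical here.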

Keywords:  best selection ensemble; facial emotion recognition; spatiotemporal; temporal-pyramid; video emotion recognition

Year:  2021        PMID: 33801739      PMCID: PMC8036494          DOI: 10.3390/s21072344

Source DB:  PubMed          Journal:  Sensors (Basel)        ISSN: 1424-8220            Impact factor:   3.576


References (14 in total)

1.  Tracking-Learning-Detection.

Authors:  Zdenek Kalal; Krystian Mikolajczyk; Jiri Matas
Journal:  IEEE Trans Pattern Anal Mach Intell       Date:  2011-12-13       Impact factor: 6.226

2.  A dynamic texture-based approach to recognition of facial actions and their temporal models.

Authors:  Sander Koelstra; Maja Pantic; Ioannis Patras
Journal:  IEEE Trans Pattern Anal Mach Intell       Date:  2010-11       Impact factor: 6.226

3.  Dynamics of facial expression: recognition of facial actions and their temporal segments from face profile image sequences.

Authors:  Maja Pantic; Ioannis Patras
Journal:  IEEE Trans Syst Man Cybern B Cybern       Date:  2006-04

4.  Dynamic texture recognition using local binary patterns with an application to facial expressions.

Authors:  Guoying Zhao; Matti Pietikäinen
Journal:  IEEE Trans Pattern Anal Mach Intell       Date:  2007-06       Impact factor: 6.226

5.  Facial expression recognition in image sequences using geometric deformation features and Support Vector Machines.

Authors:  Irene Kotsia; Ioannis Pitas
Journal:  IEEE Trans Image Process       Date:  2007-01       Impact factor: 10.856

6.  Emotion recognition from expressions in face, voice, and body: the Multimodal Emotion Recognition Test (MERT).

Authors:  Tanja Bänziger; Didier Grandjean; Klaus R Scherer
Journal:  Emotion       Date:  2009-10

7.  Long short-term memory.

Authors:  S Hochreiter; J Schmidhuber
Journal:  Neural Comput       Date:  1997-11-15       Impact factor: 2.026

8.  (Review) Survey on RGB, 3D, Thermal, and Multimodal Approaches for Facial Expression Recognition: History, Trends, and Affect-Related Applications.

Authors:  Ciprian Adrian Corneanu; Marc Oliu Simon; Jeffrey F Cohn; Sergio Escalera Guerrero
Journal:  IEEE Trans Pattern Anal Mach Intell       Date:  2016-01-07       Impact factor: 6.226

9.  Skeleton-Based Emotion Recognition Based on Two-Stream Self-Attention Enhanced Spatial-Temporal Graph Convolutional Network.

Authors:  Jiaqi Shi; Chaoran Liu; Carlos Toshinori Ishi; Hiroshi Ishiguro
Journal:  Sensors (Basel)       Date:  2020-12-30       Impact factor: 3.576

10.  Faces in context: a review and systematization of contextual influences on affective face processing.

Authors:  Matthias J Wieser; Tobias Brosch
Journal:  Front Psychol       Date:  2012-11-02

Beijing Coyote Bioscience Co., Ltd. © 2022-2023.