Literature DB >> 26548943

Multimodal emotional state recognition using sequence-dependent deep hierarchical features.

Pablo Barros, Doreen Jirak, Cornelius Weber, Stefan Wermter.

Abstract

Emotional state recognition has become an important topic for human-robot interaction in recent years. By recognizing emotion expressions, robots can identify important variables of human behavior, communicate in a more human-like fashion, and thereby extend the interaction possibilities. Human emotions are multimodal and spontaneous, which makes them difficult for robots to recognize. Each modality has its own restrictions and constraints which, together with the unstructured nature of spontaneous expressions, create several difficulties for the approaches in the literature, most of which rely on explicit feature extraction techniques and manual modality fusion. Our model uses a hierarchical feature representation to deal with spontaneous emotions and learns how to integrate multiple modalities for non-verbal emotion recognition, making it suitable for an HRI scenario. Our experiments show that recognition accuracy improves significantly when hierarchical features and multimodal information are used, and our model raises the state-of-the-art accuracy on a benchmark dataset of spontaneous emotion expressions from the 82.5% reported in the literature to 91.3%.
Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.


Keywords:  Convolutional Neural Networks; Deep learning; Emotion recognition; Hierarchical features; Human Robot Interaction


Year:  2015        PMID: 26548943     DOI: 10.1016/j.neunet.2015.09.009

Source DB:  PubMed          Journal:  Neural Netw        ISSN: 0893-6080


Cited by:  2 in total

1.  Developing crossmodal expression recognition based on a deep neural model.

Authors:  Pablo Barros; Stefan Wermter
Journal:  Adapt Behav       Date:  2016-10-10       Impact factor: 1.942

2.  Emotion Recognition from Skeletal Movements.

Authors:  Tomasz Sapiński; Dorota Kamińska; Adam Pelikant; Gholamreza Anbarjafari
Journal:  Entropy (Basel)       Date:  2019-06-29       Impact factor: 2.524

