
A dynamic appearance descriptor approach to facial actions temporal modeling.

Bihan Jiang, Michel Valstar, Brais Martinez, Maja Pantic.   

Abstract

Both the configuration and the dynamics of facial expressions are crucial for the interpretation of human facial behavior. Yet to date, the vast majority of reported efforts in the field either do not take the dynamics of facial expressions into account, or focus only on prototypic facial expressions of the six basic emotions. Facial dynamics can be explicitly analyzed by detecting the constituent temporal segments of Facial Action Coding System (FACS) Action Units (AUs): onset, apex, and offset. In this paper, we present a novel approach to explicit analysis of the temporal dynamics of facial actions using the dynamic appearance descriptor Local Phase Quantization from Three Orthogonal Planes (LPQ-TOP). Temporal segments are detected by combining a discriminative classifier, which detects the temporal segments on a frame-by-frame basis, with Markov models that enforce temporal consistency over the whole episode. The system is evaluated in detail on the MMI facial expression database, the UNBC-McMaster pain database, the SAL database, and the GEMEP-FERA dataset in database-dependent experiments, and on the Cohn-Kanade and SEMAINE databases in cross-database experiments. Comparison with other state-of-the-art methods shows that the proposed LPQ-TOP method outperforms the other approaches on the problem of AU temporal segment detection, and that overall AU activation detection benefits from dynamic appearance information.
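The temporal-consistency step described in the abstract (a per-frame classifier whose scores are smoothed by a Markov model over the neutral/onset/apex/offset states) can be sketched as standard Viterbi decoding. The state set, transition matrix, initial distribution, and per-frame scores below are illustrative assumptions for the sketch, not the paper's actual model or parameters:

```python
import numpy as np

# AU temporal-segment states, as described in the abstract.
STATES = ["neutral", "onset", "apex", "offset"]

def viterbi_smooth(frame_log_probs, log_trans, log_init):
    """Most likely state sequence given per-frame classifier scores
    and a Markov model that enforces temporal consistency."""
    T, S = frame_log_probs.shape
    delta = np.empty((T, S))            # best log score ending in state s at frame t
    back = np.zeros((T, S), dtype=int)  # backpointers
    delta[0] = log_init + frame_log_probs[0]
    for t in range(1, T):
        scores = delta[t - 1][:, None] + log_trans  # scores[i, j]: leave i, enter j
        back[t] = np.argmax(scores, axis=0)
        delta[t] = scores[back[t], np.arange(S)] + frame_log_probs[t]
    path = np.zeros(T, dtype=int)
    path[-1] = int(np.argmax(delta[-1]))
    for t in range(T - 2, -1, -1):
        path[t] = back[t + 1, path[t + 1]]
    return [STATES[i] for i in path]

# Illustrative (made-up) Markov model: each segment may persist or advance
# in the cyclic order neutral -> onset -> apex -> offset -> neutral.
trans = np.array([[0.6, 0.4, 0.0, 0.0],
                  [0.0, 0.6, 0.4, 0.0],
                  [0.0, 0.0, 0.6, 0.4],
                  [0.4, 0.0, 0.0, 0.6]])
with np.errstate(divide="ignore"):  # log(0) -> -inf for forbidden transitions
    log_trans = np.log(trans)
    log_init = np.log(np.array([1.0, 0.0, 0.0, 0.0]))  # episodes start neutral

# Made-up per-frame scores; frame 2 is misclassified as "neutral" in isolation.
frame_probs = np.array([[0.9, 0.1, 0.1, 0.1],
                        [0.1, 0.9, 0.1, 0.1],
                        [0.6, 0.3, 0.1, 0.1],
                        [0.1, 0.1, 0.9, 0.1],
                        [0.1, 0.1, 0.1, 0.9]])
segments = viterbi_smooth(np.log(frame_probs), log_trans, log_init)
print(segments)  # frame 2 is corrected to "onset" by the Markov model
```

Note how the forbidden onset-to-neutral transition forces the decoder to override the noisy frame-level decision at frame 2, which is the role the abstract assigns to the Markov model over the whole episode.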


Year:  2014        PMID: 23757539     DOI: 10.1109/TCYB.2013.2249063

Source DB:  PubMed          Journal:  IEEE Trans Cybern        ISSN: 2168-2267            Impact factor:   11.448


Related articles: 4 in total

1.  Cross-domain AU Detection: Domains, Learning Approaches, and Measures.

Authors:  Itir Onal Ertugrul; Jeffrey F Cohn; László A Jeni; Zheng Zhang; Lijun Yin; Qiang Ji
Journal:  Proc Int Conf Autom Face Gesture Recognit       Date:  2019-07-11

2.  The Automatic Detection of Chronic Pain-Related Expression: Requirements, Challenges and the Multimodal EmoPain Dataset.

Authors:  Min S H Aung; Sebastian Kaltwang; Bernardino Romera-Paredes; Brais Martinez; Aneesha Singh; Matteo Cella; Michel Valstar; Hongying Meng; Andrew Kemp; Moshen Shafizadeh; Aaron C Elkins; Natalie Kanakam; Amschel de Rothschild; Nick Tyler; Paul J Watson; Amanda C de C Williams; Maja Pantic; Nadia Bianchi-Berthouze
Journal:  IEEE Trans Affect Comput       Date:  2015-07-30       Impact factor: 10.506

3.  Automatic Detection of Depression in Speech Using Ensemble Convolutional Neural Networks.

Authors:  Adrián Vázquez-Romero; Ascensión Gallardo-Antolín
Journal:  Entropy (Basel)       Date:  2020-06-20       Impact factor: 2.524

4.  Quantifying dynamic facial expressions under naturalistic conditions.

Authors:  Jayson Jeganathan; Megan Campbell; Matthew Hyett; Gordon Parker; Michael Breakspear
Journal:  eLife       Date:  2022-08-31       Impact factor: 8.713

