
Context-Sensitive Dynamic Ordinal Regression for Intensity Estimation of Facial Action Units.

Ognjen Rudovic, Vladimir Pavlovic, Maja Pantic.   

Abstract

Modeling the intensity of facial action units from spontaneously displayed facial expressions is challenging, mainly because of high variability in subject-specific facial expressiveness, head movements, illumination changes, etc. These factors make the target problem highly context-sensitive, yet existing methods usually ignore this context-sensitivity. We propose a novel Conditional Ordinal Random Field (CORF) model for context-sensitive modeling of facial action unit intensity, based on the W5+ (who, when, what, where, why and how) definition of context. While the proposed model is general enough to handle all six context questions, in this paper we focus on three: who (the observed subject), how (the changes in facial expressions), and when (the timing of facial expressions and their intensity). The context questions who and how are modeled by means of the newly introduced context-dependent covariate effects, and the context question when is modeled in terms of temporal correlation between the ordinal outputs, i.e., the intensity levels of action units. We also introduce a weighted softmax-margin learning of CRFs from data with a skewed distribution of intensity levels, which is commonly encountered in spontaneous facial data. The proposed model is evaluated on intensity estimation of pain and facial action units using two recently published datasets (UNBC Shoulder Pain and DISFA) of spontaneously displayed facial expressions. Our experiments show that the proposed model performs significantly better on the target tasks than state-of-the-art approaches. Furthermore, compared to traditional learning of CRFs, the proposed weighted learning results in more robust parameter estimation from the imbalanced intensity data.
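The weighted learning idea mentioned in the abstract can be illustrated with a minimal sketch. This is not the paper's CORF model or its softmax-margin objective; it only shows the core re-weighting trick for skewed intensity labels, where each sample's loss gradient is scaled by the inverse frequency of its class so that rare high-intensity levels are not drowned out by the dominant neutral level. All data and names here are hypothetical.

```python
import numpy as np

# Hypothetical toy data: 300 frames, 5 features, 4 skewed intensity
# levels where level 0 (neutral) dominates, mimicking spontaneous data.
rng = np.random.default_rng(0)
X = rng.normal(size=(300, 5))
y = rng.choice(4, size=300, p=[0.7, 0.15, 0.1, 0.05])

n_classes = 4
counts = np.bincount(y, minlength=n_classes)
# Inverse-frequency class weights: rare levels get large weights.
weights = counts.sum() / (n_classes * np.maximum(counts, 1))

W = np.zeros((5, n_classes))
b = np.zeros(n_classes)
lr = 0.1

for _ in range(200):
    logits = X @ W + b
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    probs = np.exp(logits)
    probs /= probs.sum(axis=1, keepdims=True)
    onehot = np.eye(n_classes)[y]
    # Scale each sample's softmax gradient by its class weight, so rare
    # intensity levels contribute as much as the frequent neutral level.
    grad = (probs - onehot) * weights[y][:, None]
    W -= lr * X.T @ grad / len(y)
    b -= lr * grad.mean(axis=0)

pred = (X @ W + b).argmax(axis=1)
print("predicted class distribution:", np.bincount(pred, minlength=n_classes))
```

The same weighting scheme applies whether the per-class loss is a plain cross-entropy, as here, or the softmax-margin CRF objective used in the paper; only the per-sample loss term changes.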


Year:  2015        PMID: 26353320     DOI: 10.1109/TPAMI.2014.2356192

Source DB:  PubMed          Journal:  IEEE Trans Pattern Anal Mach Intell        ISSN: 0162-8828            Impact factor:   6.226


  5 in total

1.  Confidence Preserving Machine for Facial Action Unit Detection.

Authors:  Fernando De la Torre; Jeffrey F Cohn
Journal:  IEEE Trans Image Process       Date:  2016-07-27       Impact factor: 10.856

2.  Learning Facial Action Units with Spatiotemporal Cues and Multi-label Sampling.

Authors:  Wen-Sheng Chu; Fernando De la Torre; Jeffrey F Cohn
Journal:  Image Vis Comput       Date:  2018-10-28       Impact factor: 2.818

3.  Selective Transfer Machine for Personalized Facial Expression Analysis.

Authors:  Fernando De la Torre; Jeffrey F Cohn
Journal:  IEEE Trans Pattern Anal Mach Intell       Date:  2016-03-28       Impact factor: 6.226

4.  Discriminant Functional Learning of Color Features for the Recognition of Facial Action Units and Their Intensities.

Authors:  C Fabian Benitez-Quiroz; Ramprakash Srinivasan; Aleix M Martinez
Journal:  IEEE Trans Pattern Anal Mach Intell       Date:  2018-09-05       Impact factor: 6.226

5.  The Automatic Detection of Chronic Pain-Related Expression: Requirements, Challenges and the Multimodal EmoPain Dataset.

Authors:  Min S H Aung; Sebastian Kaltwang; Bernardino Romera-Paredes; Brais Martinez; Aneesha Singh; Matteo Cella; Michel Valstar; Hongying Meng; Andrew Kemp; Moshen Shafizadeh; Aaron C Elkins; Natalie Kanakam; Amschel de Rothschild; Nick Tyler; Paul J Watson; Amanda C de C Williams; Maja Pantic; Nadia Bianchi-Berthouze
Journal:  IEEE Trans Affect Comput       Date:  2015-07-30       Impact factor: 10.506

