Reliable facial muscle activation enhances recognizability and credibility of emotional expression.

Marc Mehu, Marcello Mortillaro, Tanja Bänziger, Klaus R Scherer

Abstract

We tested Ekman's (2003) suggestion that movements of a small number of reliable facial muscles are particularly trustworthy cues to experienced emotion because they tend to be difficult to produce voluntarily. On the basis of theoretical predictions, we identified two subsets of facial action units (AUs): reliable AUs and versatile AUs. A survey on the controllability of facial AUs confirmed that reliable AUs indeed seem more difficult to control than versatile AUs, although the distinction between the two sets of AUs should be understood as a difference in degree of controllability rather than a discrete categorization. Professional actors enacted a series of emotional states using method acting techniques, and their facial expressions were rated by independent judges. The effect of the two subsets of AUs (reliable AUs and versatile AUs) on identification of the emotion conveyed, its perceived authenticity, and perceived intensity was investigated. Activation of the reliable AUs had a stronger effect than that of versatile AUs on the identification, perceived authenticity, and perceived intensity of the emotion expressed. We found little evidence, however, for specific links between individual AUs and particular emotion categories. We conclude that reliable AUs may indeed convey trustworthy information about emotional processes but that most of these AUs are likely to be shared by several emotions rather than providing information about specific emotions. This study also suggests that the issue of reliable facial muscles may generalize beyond the Duchenne smile.

Year:  2012        PMID: 22642350     DOI: 10.1037/a0026717

Source DB:  PubMed          Journal:  Emotion        ISSN: 1528-3542


Related articles: 13 in total

1. (Review)  A psycho-ethological approach to social signal processing.

Authors:  Marc Mehu; Klaus R Scherer
Journal:  Cogn Process       Date:  2012-02-11

2.  Proprioceptive ability at the lips and jaw measured using the same psychophysical discrimination task.

Authors:  Ellie Frayne; Susan Coulson; Roger Adams; Glen Croxson; Gordon Waddington
Journal:  Exp Brain Res       Date:  2016-02-09       Impact factor: 1.972

3.  Interpretable Self-Supervised Facial Micro-Expression Learning to Predict Cognitive State and Neurological Disorders.

Authors:  Arun Das; Jeffrey Mock; Yufei Huang; Edward Golob; Peyman Najafirad
Journal:  Proc Conf AAAI Artif Intell       Date:  2021-05-18

4.  Children can discriminate the authenticity of happy but not sad or fearful facial expressions, and use an immature intensity-only strategy.

Authors:  Amy Dawel; Romina Palermo; Richard O'Kearney; Elinor McKone
Journal:  Front Psychol       Date:  2015-05-05

5.  A False Trail to Follow: Differential Effects of the Facial Feedback Signals From the Upper and Lower Face on the Recognition of Micro-Expressions.

Authors:  Xuemei Zeng; Qi Wu; Siwei Zhang; Zheying Liu; Qing Zhou; Meishan Zhang
Journal:  Front Psychol       Date:  2018-10-24

6.  Naturalistic Emotion Decoding From Facial Action Sets.

Authors:  Sylwia Hyniewska; Wataru Sato; Susanne Kaiser; Catherine Pelachaud
Journal:  Front Psychol       Date:  2019-01-18

7.  Micro-Expressions of Fear During the 2016 Presidential Campaign Trail: Their Influence on Trait Perceptions of Donald Trump.

Authors:  Patrick A Stewart; Elena Svetieva
Journal:  Front Psychol       Date:  2021-06-02

8.  The integration of emotional and symbolic components in multimodal communication.

Authors:  Marc Mehu
Journal:  Front Psychol       Date:  2015-07-07

9.  Candidate Performance and Observable Audience Response: Laughter and Applause-Cheering During the First 2016 Clinton-Trump Presidential Debate.

Authors:  Patrick A Stewart; Austin D Eubanks; Reagan G Dye; Zijian H Gong; Erik P Bucy; Robert H Wicks; Scott Eidelman
Journal:  Front Psychol       Date:  2018-07-20

10.  Human perception and biosignal-based identification of posed and spontaneous smiles.

Authors:  Monica Perusquía-Hernández; Saho Ayabe-Kanamura; Kenji Suzuki
Journal:  PLoS One       Date:  2019-12-12       Impact factor: 3.240


北京卡尤迪生物科技股份有限公司 (Beijing Kayudi Biotechnology Co., Ltd.) © 2022-2023.