
Gently does it: Humans outperform a software classifier in recognizing subtle, nonstereotypical facial expressions.

Neta Yitzhak, Nir Giladi, Tanya Gurevich, Daniel S Messinger, Emily B Prince, Katherine Martin, Hillel Aviezer.

Abstract

According to dominant theories of affect, humans innately and universally express a set of emotions using specific configurations of prototypical facial activity. Accordingly, thousands of studies have tested emotion recognition using sets of highly intense and stereotypical facial expressions, yet their incidence in real life is virtually unknown. In fact, a commonplace experience is that emotions are expressed in subtle and nonprototypical forms. Such facial expressions are the focus of the current study. In Experiment 1, we present the development and validation of a novel stimulus set consisting of dynamic and subtle emotional facial displays conveyed without constraining expressers to using prototypical configurations. Although these subtle expressions were more challenging to recognize than prototypical dynamic expressions, they were still well recognized by human raters, and perhaps most importantly, they were rated as more ecological and naturalistic than the prototypical expressions. In Experiment 2, we examined the characteristics of subtle versus prototypical expressions by subjecting them to a software classifier, which used prototypical basic emotion criteria. Although the software was highly successful at classifying prototypical expressions, it performed very poorly at classifying the subtle expressions. Further validation was obtained from human expert face coders: Subtle stimuli did not contain many of the key facial movements present in prototypical expressions. Together, these findings suggest that emotions may be successfully conveyed to human viewers using subtle nonprototypical expressions. Although classic prototypical facial expressions are well recognized, they appear less naturalistic and may not capture the richness of everyday emotional communication. (PsycINFO Database Record (c) 2017 APA, all rights reserved).


Year:  2017        PMID: 28406679     DOI: 10.1037/emo0000287

Source DB:  PubMed          Journal:  Emotion        ISSN: 1528-3542


  7 in total

Review 1.  Emotional Expressions Reconsidered: Challenges to Inferring Emotion From Human Facial Movements.

Authors:  Lisa Feldman Barrett; Ralph Adolphs; Stacy Marsella; Aleix M Martinez; Seth D Pollak
Journal:  Psychol Sci Public Interest       Date:  2019-07

Review 2.  Commercial Use of Emotion Artificial Intelligence (AI): Implications for Psychiatry.

Authors:  Scott Monteith; Tasha Glenn; John Geddes; Peter C Whybrow; Michael Bauer
Journal:  Curr Psychiatry Rep       Date:  2022-02-25       Impact factor: 5.285

Review 3.  Closed Loop Deep Brain Stimulation for PTSD, Addiction, and Disorders of Affective Facial Interpretation: Review and Discussion of Potential Biomarkers and Stimulation Paradigms.

Authors:  Robert W Bina; Jean-Phillipe Langevin
Journal:  Front Neurosci       Date:  2018-05-04       Impact factor: 4.677

4.  What's in a face: Automatic facial coding of untrained study participants compared to standardized inventories.

Authors:  T Tim A Höfling; Georg W Alpers; Björn Büdenbender; Ulrich Föhl; Antje B M Gerdes
Journal:  PLoS One       Date:  2022-03-03       Impact factor: 3.240

5.  Opportunities and Challenges for Using Automatic Human Affect Analysis in Consumer Research.

Authors:  Dennis Küster; Eva G Krumhuber; Lars Steinert; Anuj Ahuja; Marc Baker; Tanja Schultz
Journal:  Front Neurosci       Date:  2020-04-28       Impact factor: 4.677

6.  A performance comparison of eight commercially available automatic classifiers for facial affect recognition.

Authors:  Damien Dupré; Eva G Krumhuber; Dennis Küster; Gary J McKeown
Journal:  PLoS One       Date:  2020-04-24       Impact factor: 3.240

7.  Human and machine validation of 14 databases of dynamic facial expressions.

Authors:  Eva G Krumhuber; Dennis Küster; Shushi Namba; Lina Skora
Journal:  Behav Res Methods       Date:  2021-04
