
Dynamic information for the recognition of conversational expressions.

Douglas W Cunningham, Christian Wallraven.

Abstract

Communication is critical for normal, everyday life. During a conversation, information is conveyed in a number of ways, including through body, head, and facial changes. While much research has examined these latter forms of communication, the majority of it has focused on static representations of a few, supposedly universal expressions. Normal conversations, however, contain a very wide variety of expressions and are rarely, if ever, static. Here, we report several experiments that show that expressions that use head, eye, and internal facial motion are recognized more easily and accurately than static versions of those expressions. Moreover, we demonstrate conclusively that this dynamic advantage is due to information that is only available over time, and that the temporal integration window for this information is at least 100 ms long.

MeSH:

Year:  2009        PMID: 20055540     DOI: 10.1167/9.13.7

Source DB:  PubMed          Journal:  J Vis        ISSN: 1534-7362            Impact factor:   2.240


Related articles: 21 in total

1.  Decoding facial expressions based on face-selective and motion-sensitive areas.

Authors:  Yin Liang; Baolin Liu; Junhai Xu; Gaoyan Zhang; Xianglin Li; Peiyuan Wang; Bin Wang
Journal:  Hum Brain Mapp       Date:  2017-03-27       Impact factor: 5.038

2.  The efficiency of dynamic and static facial expression recognition.

Authors:  Jason M Gold; Jarrett D Barker; Shawn Barr; Jennifer L Bittner; W Drew Bromfield; Nicole Chu; Roy A Goode; Doori Lee; Michael Simmons; Aparna Srinath
Journal:  J Vis       Date:  2013-04-25       Impact factor: 2.240

3.  (Review) Emotional Expressions Reconsidered: Challenges to Inferring Emotion From Human Facial Movements.

Authors:  Lisa Feldman Barrett; Ralph Adolphs; Stacy Marsella; Aleix M Martinez; Seth D Pollak
Journal:  Psychol Sci Public Interest       Date:  2019-07

4.  The Sabancı University Dynamic Face Database (SUDFace): Development and validation of an audiovisual stimulus set of recited and free speeches with neutral facial expressions.

Authors:  Yağmur Damla Şentürk; Ebru Ecem Tavacioglu; İlker Duymaz; Bilge Sayim; Nihan Alp
Journal:  Behav Res Methods       Date:  2022-08-26

5.  Exploring the Use of Isolated Expressions and Film Clips to Evaluate Emotion Recognition by People with Traumatic Brain Injury.

Authors:  Barbra Zupan; Dawn Neumann
Journal:  J Vis Exp       Date:  2016-05-15       Impact factor: 1.355

6.  The MPI facial expression database--a validated database of emotional and conversational facial expressions.

Authors:  Kathrin Kaulard; Douglas W Cunningham; Heinrich H Bülthoff; Christian Wallraven
Journal:  PLoS One       Date:  2012-03-15       Impact factor: 3.240

7.  Perception of temporal asymmetries in dynamic facial expressions.

Authors:  Maren Reinl; Andreas Bartels
Journal:  Front Psychol       Date:  2015-08-04

8.  Age-Related Response Bias in the Decoding of Sad Facial Expressions.

Authors:  Mara Fölster; Ursula Hess; Isabell Hühnel; Katja Werheid
Journal:  Behav Sci (Basel)       Date:  2015-10-27

9.  Common cues to emotion in the dynamic facial expressions of speech and song.

Authors:  Steven R Livingstone; William F Thompson; Marcelo M Wanderley; Caroline Palmer
Journal:  Q J Exp Psychol (Hove)       Date:  2014-11-25       Impact factor: 2.143

10.  The recognition of facial expressions of emotion in deaf and hearing individuals.

Authors:  Helen Rodger; Junpeng Lao; Chloé Stoll; Anne-Raphaëlle Richoz; Olivier Pascalis; Matthew Dye; Roberto Caldara
Journal:  Heliyon       Date:  2021-05-15
