
The Sabancı University Dynamic Face Database (SUDFace): Development and validation of an audiovisual stimulus set of recited and free speeches with neutral facial expressions.

Yağmur Damla Şentürk1, Ebru Ecem Tavacioglu1, İlker Duymaz1, Bilge Sayim2,3, Nihan Alp4.   

Abstract

Faces convey a wide range of information, including one's identity and emotional and mental states. Face perception is a major topic in many research fields, such as cognitive science, social psychology, and neuroscience. Frequently, stimuli are selected from a range of available face databases. However, even though faces are highly dynamic, most databases consist of static face stimuli. Here, we introduce the Sabancı University Dynamic Face (SUDFace) database. The SUDFace database consists of 150 high-resolution audiovisual videos acquired in a controlled lab environment and stored with a resolution of 1920 × 1080 pixels at a frame rate of 60 Hz. The multimodal database consists of three videos of each human model in frontal view in three different conditions: vocalizing two scripted texts (conditions 1 and 2) and one free speech (condition 3). The main focus of the SUDFace database is to provide a large set of dynamic faces with neutral facial expressions and natural speech articulation. Variables such as face orientation, illumination, and accessories (piercings, earrings, facial hair, etc.) were kept constant across all stimuli. We provide detailed stimulus information, including facial features (pixel-wise calculations of face length, eye width, etc.) and speeches (e.g., duration of speech and repetitions). In two validation experiments, a total of 227 participants rated each video on several psychological dimensions (e.g., neutralness and naturalness of expressions, valence, and the perceived mental states of the models) using Likert scales. The database is freely accessible for research purposes.
© 2022. The Psychonomic Society, Inc.

Keywords:  Dynamic face; Face database; Face recognition; Natural face; Neutral face; Speech recognition

Year:  2022        PMID: 36018484     DOI: 10.3758/s13428-022-01951-z

Source DB:  PubMed          Journal:  Behav Res Methods        ISSN: 1554-351X

