Emotion recognition from expressions in face, voice, and body: the Multimodal Emotion Recognition Test (MERT).

Tanja Bänziger, Didier Grandjean, Klaus R Scherer.

Abstract

Emotion recognition ability has been identified as a central component of emotional competence. We describe the development of an instrument that objectively measures this ability on the basis of actor portrayals of dynamic expressions of 10 emotions (2 variants each for 5 emotion families), operationalized as recognition accuracy in 4 presentation modes combining the visual and auditory sense modalities (audio/video, audio only, video only, still picture). Data from a large validation study, including construct validation using related tests (Profile of Nonverbal Sensitivity; Rosenthal, Hall, DiMatteo, Rogers, & Archer, 1979; Japanese and Caucasian Facial Expressions of Emotion; Biehl et al., 1997; Diagnostic Analysis of Nonverbal Accuracy; Nowicki & Duke, 1994; Emotion Recognition Index; Scherer & Scherer, 2008), are reported. The results show the utility of a test designed to measure both coarse and fine-grained emotion differentiation and modality-specific skills. Factor analysis of the data suggests 2 separate abilities, visual and auditory recognition, which seem to be largely independent of personality dispositions.

Year:  2009        PMID: 19803591     DOI: 10.1037/a0017088

Source DB:  PubMed          Journal:  Emotion        ISSN: 1528-3542


Related articles: 44 in total

1.  How is this child feeling? Preschool-aged children's ability to recognize emotion in faces and body poses.

Authors:  Alison E Parker; Erin T Mathis; Janis B Kupersmidt
Journal:  Early Educ Dev       Date:  2013-02-07

2.  Effective connectivity predicts cognitive empathy in cocaine addiction: a spectral dynamic causal modeling study.

Authors:  Luqing Wei; Guo-Rong Wu; Minghua Bi; Chris Baeken
Journal:  Brain Imaging Behav       Date:  2021-06       Impact factor: 3.978

3.  Development and Feasibility of MindChip™: A Social Emotional Telehealth Intervention for Autistic Adults.

Authors:  Julia S Y Tang; Marita Falkmer; Nigel T M Chen; Sven Bölte; Sonya Girdler
Journal:  J Autism Dev Disord       Date:  2021-04

4.  FACSHuman, a software program for creating experimental material by modeling 3D facial expressions.

Authors:  Michaël Gilbert; Samuel Demarchi; Isabel Urdapilleta
Journal:  Behav Res Methods       Date:  2021-04-06

5.  EUReKA! A Conceptual Model of Emotion Understanding.

Authors:  Vanessa L Castro; Yanhua Cheng; Amy G Halberstadt; Daniel Grühn
Journal:  Emot Rev       Date:  2015-04-22

6.  The Mandarin Chinese auditory emotions stimulus database: A validated set of Chinese pseudo-sentences.

Authors:  Bingyan Gong; Na Li; Qiuhong Li; Xinyuan Yan; Jing Chen; Liang Li; Xihong Wu; Chao Wu
Journal:  Behav Res Methods       Date:  2022-05-31

7.  The Jena Voice Learning and Memory Test (JVLMT): A standardized tool for assessing the ability to learn and recognize voices.

Authors:  Denise Humble; Stefan R Schweinberger; Axel Mayer; Tim L Jesgarzewsky; Christian Dobel; Romi Zäske
Journal:  Behav Res Methods       Date:  2022-06-01

8.  Perceived Intensity of Emotional Point-Light Displays is Reduced in Subjects with ASD.

Authors:  Britta Krüger; Morten Kaletsch; Sebastian Pilgramm; Sven-Sören Schwippert; Jürgen Hennig; Rudolf Stark; Stefanie Lis; Bernd Gallhofer; Gebhard Sammer; Karen Zentgraf; Jörn Munzert
Journal:  J Autism Dev Disord       Date:  2018-01

9.  Context-Aware Emotion Recognition in the Wild Using Spatio-Temporal and Temporal-Pyramid Models.

Authors:  Nhu-Tai Do; Soo-Hyung Kim; Hyung-Jeong Yang; Guee-Sang Lee; Soonja Yeom
Journal:  Sensors (Basel)       Date:  2021-03-27       Impact factor: 3.576

10.  Crossmodal adaptation in right posterior superior temporal sulcus during face-voice emotional integration.

Authors:  Rebecca Watson; Marianne Latinus; Takao Noguchi; Oliver Garrod; Frances Crabbe; Pascal Belin
Journal:  J Neurosci       Date:  2014-05-14       Impact factor: 6.167

