
Introducing ACASS: An Annotated Character Animation Stimulus Set for Controlled (e)Motion Perception Studies.

Sebastian Lammers1,2, Gary Bente3, Ralf Tepest1, Mathis Jording2, Daniel Roth4, Kai Vogeley1,2.   

Abstract

Others' movements inform us about their current activities as well as their intentions and emotions. Research on the distinct mechanisms underlying action recognition and emotion inference has been limited by a lack of suitable comparative stimulus material. Problematic confounds can arise from low-level physical features (e.g., luminance) as well as from higher-level psychological features (e.g., stimulus difficulty). Here we present a standardized stimulus dataset that makes it possible to address both action and emotion recognition with identical stimuli. The stimulus set consists of 792 computer animations of a neutral avatar based on full-body motion-capture protocols. Motion capture was performed on 22 human volunteers, instructed to perform six everyday activities (mopping, sweeping, painting with a roller, painting with a brush, wiping, sanding) in three different moods (angry, happy, sad). Five-second clips of each motion protocol were rendered into AVI files using two virtual camera perspectives per clip. In contrast to video stimuli, the computer animations allowed us to standardize the physical appearance of the avatar and to control lighting and coloring conditions, thus reducing stimulus variation to movement alone. To control for low-level optical features of the stimuli, we developed and applied a set of MATLAB routines that extract basic physical features, including the average background-foreground proportion and frame-by-frame pixel-change dynamics. This information was used to identify outliers and to homogenize the stimuli across action and emotion categories, yielding a smaller stimulus subset (n = 83 animations within the 792-clip database) containing only two actions (mopping, sweeping) and two moods (angry, happy).
To further homogenize this stimulus subset with regard to psychological criteria, we conducted an online observer study (N = 112 participants) assessing recognition rates for actions and moods, which led to a final sub-selection of 32 clips (eight per category) within the database. The ACASS database and its subsets provide unique opportunities for research applications in social psychology, social neuroscience, and applied clinical studies of communication disorders. All 792 AVI files, the selected subsets, MATLAB code, annotations, and motion-capture data (FBX files) are available online.
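The low-level features described above can be illustrated with a minimal sketch. This is not the authors' MATLAB code (which is available online with the dataset); it is a hypothetical Python analogue, assuming grayscale frames given as 2-D lists of pixel intensities in [0, 255] and a uniform background value:

```python
# Hypothetical sketch of the two low-level stimulus features named in the
# abstract; frames are 2-D lists of grayscale values, background assumed uniform.

def foreground_proportion(frame, background_value=0, tolerance=10):
    """Fraction of pixels differing from the background by more than `tolerance`."""
    total = sum(len(row) for row in frame)
    foreground = sum(1 for row in frame for px in row
                     if abs(px - background_value) > tolerance)
    return foreground / total

def pixel_change_dynamics(frames):
    """Mean absolute pixel change between consecutive frames (one value per transition)."""
    changes = []
    for prev, curr in zip(frames, frames[1:]):
        diffs = [abs(a - b)
                 for row_p, row_c in zip(prev, curr)
                 for a, b in zip(row_p, row_c)]
        changes.append(sum(diffs) / len(diffs))
    return changes
```

Averaging such per-clip values across the 792 animations would then allow flagging clips whose foreground proportion or change dynamics deviate strongly from their category mean, in the spirit of the outlier screening described above.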
Copyright © 2019 Lammers, Bente, Tepest, Jording, Roth and Vogeley.

Keywords:  body motion; experimental paradigms; human interaction; motion capture; non-verbal behavior; social cognition; visual stimuli

Year:  2019        PMID: 33501109      PMCID: PMC7805965          DOI: 10.3389/frobt.2019.00094

Source DB:  PubMed          Journal:  Front Robot AI        ISSN: 2296-9144
