
A motion capture library for the study of identity, gender, and emotion perception from biological motion.

Yingliang Ma, Helena M Paterson, Frank E Pollick

Abstract

We present the methods that were used in capturing a library of human movements for use in computer-animated displays of human movement. The library is an attempt to systematically tap into and represent the wide range of personal properties, such as identity, gender, and emotion, that are available in a person's movements. The movements of a total of 30 nonprofessional actors (15 of them female) were captured while they performed walking, knocking, lifting, and throwing actions, as well as their combination, in angry, happy, neutral, and sad affective styles. From the raw motion capture data, a library of 4,080 movements was obtained, using techniques based on Character Studio (plug-ins for 3D Studio MAX, AutoDesk, Inc.), MATLAB (The MathWorks, Inc.), or a combination of the two. For the knocking, lifting, and throwing actions, 10 repetitions of the simple action unit were obtained for each affect; for the other actions, two longer movement recordings were obtained for each affect. We discuss the potential use of the library for computational and behavioral analyses of movement variability, for human character animation, and for the study of how gender, emotion, and identity are encoded and decoded from human movement.
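
The reported total of 4,080 movements follows directly from the counts given in the abstract. Below is a minimal Python sketch of that arithmetic (not from the paper itself); treating walking and the combined-action sequence as the "other actions" with two longer recordings each per affect is an assumption, but it is the grouping that reproduces the stated total.

    actors = 30                    # 15 female, 15 male nonprofessional actors
    affects = 4                    # angry, happy, neutral, sad

    recordings_per_affect = {
        "knocking": 10,            # 10 repetitions of the simple action unit
        "lifting": 10,
        "throwing": 10,
        "walking": 2,              # two longer recordings (assumed grouping)
        "combined": 2,             # combined-action sequence (assumed grouping)
    }

    per_actor = affects * sum(recordings_per_affect.values())  # 4 * 34 = 136
    total = actors * per_actor                                 # 30 * 136 = 4080
    print(total)                                               # -> 4080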

Year: 2006    PMID: 16817522    DOI: 10.3758/bf03192758

Source DB: PubMed    Journal: Behav Res Methods    ISSN: 1554-351X


Related articles: 19 in total (10 shown)

1.  A multi-sensory code for emotional arousal.

Authors:  Beau Sievers; Caitlyn Lee; William Haslett; Thalia Wheatley
Journal:  Proc Biol Sci       Date:  2019-07-10       Impact factor: 5.349

2.  Three-dimensional pose discrimination in natural images of humans.

Authors:  Hongru Zhu; Alan Yuille; Daniel Kersten
Journal:  Cogsci       Date:  2021-07

3.  Construction and validation of the Dalian emotional movement open-source set (DEMOS).

Authors:  Mingming Zhang; Lu Yu; Keye Zhang; Bixuan Du; Bin Zhan; Shuxin Jia; Shaohua Chen; Fengxu Han; Yiwen Li; Shuaicheng Liu; Xi Yi; Shenglan Liu; Wenbo Luo
Journal:  Behav Res Methods       Date:  2022-08-05

4.  The Automatic Detection of Chronic Pain-Related Expression: Requirements, Challenges and the Multimodal EmoPain Dataset.

Authors:  Min S H Aung; Sebastian Kaltwang; Bernardino Romera-Paredes; Brais Martinez; Aneesha Singh; Matteo Cella; Michel Valstar; Hongying Meng; Andrew Kemp; Moshen Shafizadeh; Aaron C Elkins; Natalie Kanakam; Amschel de Rothschild; Nick Tyler; Paul J Watson; Amanda C de C Williams; Maja Pantic; Nadia Bianchi-Berthouze
Journal:  IEEE Trans Affect Comput       Date:  2015-07-30       Impact factor: 10.506

5.  Signature movements lead to efficient search for threatening actions.

Authors:  Jeroen J A van Boxtel; Hongjing Lu
Journal:  PLoS One       Date:  2012-05-23       Impact factor: 3.240

6.  Locality sensitivity discriminant analysis-based feature ranking of human emotion actions recognition. (Review)

Authors:  Nurnadia M Khair; M Hariharan; S Yaacob; Shafriza Nisha Basah
Journal:  J Phys Ther Sci       Date:  2015-08-21

7.  Audiovisual integration of emotional signals from others' social interactions.

Authors:  Lukasz Piwek; Frank Pollick; Karin Petrini
Journal:  Front Psychol       Date:  2015-05-08

8.  Tactile input and empathy modulate the perception of ambiguous biological motion.

Authors:  Hörmetjan Yiltiz; Lihan Chen
Journal:  Front Psychol       Date:  2015-02-20

9.  Emotion through locomotion: gender impact.

Authors:  Samuel Krüger; Alexander N Sokolov; Paul Enck; Ingeborg Krägeloh-Mann; Marina A Pavlova
Journal:  PLoS One       Date:  2013-11-22       Impact factor: 3.240

10.  A database of whole-body action videos for the study of action, emotion, and untrustworthiness.

Authors:  Bruce D Keefe; Matthias Villing; Chris Racey; Samantha L Strong; Joanna Wincenciak; Nick E Barraclough
Journal:  Behav Res Methods       Date:  2014-12