
Unconscious presentation of fearful face modulates electrophysiological responses to emotional prosody.

Hirokazu Doi, Kazuyuki Shinohara.

Abstract

Cross-modal integration of visual and auditory emotional cues is thought to aid accurate recognition of emotional signals. However, the neural locus of cross-modal integration between affective prosody and unconsciously presented facial expression in the neurologically intact population remains elusive. The present study examined the influence of unconsciously presented facial expressions on event-related potentials (ERPs) during emotional prosody recognition. In the experiment, fearful, happy, and neutral faces were rendered invisible by continuous flash suppression and presented simultaneously with voices containing laughter or a fearful shout. Conventional peak analysis revealed that the ERPs were modulated interactively by emotional prosody and facial expression at multiple latency ranges, indicating that audio-visual integration of emotional signals takes place automatically, without conscious awareness. In addition, the global field power during the late-latency range was larger for the shout than for laughter only when a fearful face was presented unconsciously. The neural locus of this effect was localized to the left posterior fusiform gyrus, supporting the view that this cortical region, traditionally considered a unisensory visual area, functions as a locus of audiovisual integration of emotional signals.
© The Author 2013. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.


Keywords:  cross-modal integration; facial expression; fear; prosody; unconscious perception


Year:  2013        PMID: 24108801     DOI: 10.1093/cercor/bht282

Source DB:  PubMed          Journal:  Cereb Cortex        ISSN: 1047-3211            Impact factor:   5.357


Related articles: 6 in total

1.  The integration of facial and vocal cues during emotional change perception: EEG markers.

Authors:  Xuhai Chen; Zhihui Pan; Ping Wang; Xiaohong Yang; Peng Liu; Xuqun You; Jiajin Yuan
Journal:  Soc Cogn Affect Neurosci       Date:  2015-06-30       Impact factor: 3.436

2.  The Mandarin Chinese auditory emotions stimulus database: A validated set of Chinese pseudo-sentences.

Authors:  Bingyan Gong; Na Li; Qiuhong Li; Xinyuan Yan; Jing Chen; Liang Li; Xihong Wu; Chao Wu
Journal:  Behav Res Methods       Date:  2022-05-31

3.  Elucidating unconscious processing with instrumental hypnosis.

Authors:  Mathieu Landry; Krystèle Appourchaux; Amir Raz
Journal:  Front Psychol       Date:  2014-07-28

4.  Interaction between valence of empathy and familiarity: is it difficult to empathize with the positive events of a stranger?

Authors:  Yuki Motomura; Akira Takeshita; Yuka Egashira; Takayuki Nishimura; Yeon-Kyu Kim; Shigeki Watanuki
Journal:  J Physiol Anthropol       Date:  2015-03-22       Impact factor: 2.867

5.  Neural measures of the role of affective prosody in empathy for pain.

Authors:  Federica Meconi; Mattia Doro; Arianna Schiano Lomoriello; Giulia Mastrella; Paola Sessa
Journal:  Sci Rep       Date:  2018-01-10       Impact factor: 4.379

6.  Unconscious Processing of Facial Emotional Valence Relation: Behavioral Evidence of Integration between Subliminally Perceived Stimuli.

Authors:  Chengzhen Liu; Zhiyi Sun; Jerwen Jou; Qian Cui; Guang Zhao; Jiang Qiu; Shen Tu
Journal:  PLoS One       Date:  2016-09-13       Impact factor: 3.240

