Hand gestures as visual prosody: BOLD responses to audio-visual alignment are modulated by the communicative nature of the stimuli.

Emmanuel Biau, Luis Morís Fernández, Henning Holle, César Avila, Salvador Soto-Faraco.

Abstract

During public addresses, speakers accompany their discourse with spontaneous hand gestures (beats) that are tightly synchronized with the prosodic contour of the discourse. It has been proposed that speech and beat gestures originate from a common underlying linguistic process whereby both speech prosody and beats serve to emphasize relevant information. We hypothesized that breaking the consistency between beats and prosody by temporal desynchronization would modulate activity in brain areas sensitive to speech-gesture integration. To this aim, we measured BOLD responses as participants watched a natural discourse in which the speaker used beat gestures. In order to identify brain areas specifically involved in processing hand gestures with communicative intention, beat synchrony was evaluated against arbitrary visual cues with rhythmic and spatial properties equivalent to those of the gestures. Our results revealed that left MTG and IFG were specifically sensitive to speech synchronized with beats, compared to the arbitrary vision-speech pairing. These results suggest that listeners confer on beats a function of visual prosody, complementary to the prosodic structure of speech. We conclude that the emphasizing function of beat gestures in speech perception is instantiated through a specialized brain network sensitive to the communicative intent conveyed by a speaker with his or her hands.
Copyright © 2016. Published by Elsevier Inc.

Keywords:  Audiovisual speech; Gestures; MTG; Multisensory Integration; Speech perception; fMRI

Year:  2016        PMID: 26892858     DOI: 10.1016/j.neuroimage.2016.02.018

Source DB:  PubMed          Journal:  Neuroimage        ISSN: 1053-8119            Impact factor:   6.556


Related articles: 9 in total

1.  Meta-Analyses Support a Taxonomic Model for Representations of Different Categories of Audio-Visual Interaction Events in the Human Brain.

Authors:  Matt Csonka; Nadia Mardmomen; Paula J Webster; Julie A Brefczynski-Lewis; Chris Frum; James W Lewis
Journal:  Cereb Cortex Commun       Date:  2021-01-18

2.  Left Motor δ Oscillations Reflect Asynchrony Detection in Multisensory Speech Perception.

Authors:  Emmanuel Biau; Benjamin G Schultz; Thomas C Gunter; Sonja A Kotz
Journal:  J Neurosci       Date:  2022-01-27       Impact factor: 6.709

3.  N400 amplitude, latency, and variability reflect temporal integration of beat gesture and pitch accent during language processing.

Authors:  Laura M Morett; Nicole Landi; Julia Irwin; James C McPartland
Journal:  Brain Res       Date:  2020-08-17       Impact factor: 3.610

4. (Review) Lower Beta: A Central Coordinator of Temporal Prediction in Multimodal Speech.

Authors:  Emmanuel Biau; Sonja A Kotz
Journal:  Front Hum Neurosci       Date:  2018-10-24       Impact factor: 3.169

5.  Interpretation of Social Interactions: Functional Imaging of Cognitive-Semiotic Categories During Naturalistic Viewing.

Authors:  Dhana Wolf; Irene Mittelberg; Linn-Marlen Rekittke; Saurabh Bhavsar; Mikhail Zvyagintsev; Annina Haeck; Fengyu Cong; Martin Klasen; Klaus Mathiak
Journal:  Front Hum Neurosci       Date:  2018-08-14       Impact factor: 3.169

6.  Auditory detection is modulated by theta phase of silent lip movements.

Authors:  Emmanuel Biau; Danying Wang; Hyojin Park; Ole Jensen; Simon Hanslmayr
Journal:  Curr Res Neurobiol       Date:  2021-06-12

7.  Perceived Conventionality in Co-speech Gestures Involves the Fronto-Temporal Language Network.

Authors:  Dhana Wolf; Linn-Marlen Rekittke; Irene Mittelberg; Martin Klasen; Klaus Mathiak
Journal:  Front Hum Neurosci       Date:  2017-11-30       Impact factor: 3.169

8. (Review) Prosody in the Auditory and Visual Domains: A Developmental Perspective.

Authors:  Núria Esteve-Gibert; Bahia Guellaï
Journal:  Front Psychol       Date:  2018-03-19

9.  Why We Should Study Multimodal Language.

Authors:  Pamela Perniss
Journal:  Front Psychol       Date:  2018-06-28
