
Multimodal integration of spontaneously produced representational co-speech gestures: an fMRI study.

Jill Weisberg, Amy Lynn Hubbard, Karen Emmorey

Abstract

To examine whether more ecologically valid co-speech gesture stimuli elicit brain responses consistent with those found by studies that relied on scripted stimuli, we presented participants with spontaneously produced, meaningful co-speech gesture during fMRI scanning (n = 28). Speech presented with gesture (versus either presented alone) elicited heightened activity in bilateral posterior superior temporal, premotor, and inferior frontal regions. Within left temporal and premotor, but not inferior frontal regions, we identified small clusters with superadditive responses, suggesting that these discrete regions support both sensory and semantic integration. In contrast, surrounding areas and the inferior frontal gyrus may support either sensory or semantic integration. Reduced activation for speech with gesture in language-related regions indicates allocation of fewer neural resources when meaningful gestures accompany speech. Sign language experience did not affect co-speech gesture activation. Overall, our results indicate that scripted stimuli have minimal confounding influences; however, they may miss subtle superadditive effects.


Keywords:  co-speech gesture; fMRI; multimodal integration; multisensory integration; semantic integration

Year:  2016        PMID: 29130054      PMCID: PMC5675577          DOI: 10.1080/23273798.2016.1245426

Source DB:  PubMed          Journal:  Lang Cogn Neurosci        ISSN: 2327-3798            Impact factor:   2.331


References: 77 in total

1.  Evidence from functional magnetic resonance imaging of crossmodal binding in the human heteromodal cortex.

Authors:  G A Calvert; R Campbell; M J Brammer
Journal:  Curr Biol       Date:  2000-06-01       Impact factor: 10.834

2.  Multisensory integration sites identified by perception of spatial wavelet filtered visual speech gesture information.

Authors:  Daniel E Callan; Jeffery A Jones; Kevin Munhall; Christian Kroos; Akiko M Callan; Eric Vatikiotis-Bateson
Journal:  J Cogn Neurosci       Date:  2004-06       Impact factor: 3.225

3.  Body-specific representations of action verbs: neural evidence from right- and left-handers.

Authors:  Roel M Willems; Peter Hagoort; Daniel Casasanto
Journal:  Psychol Sci       Date:  2009-11-23

4.  Statistical criteria in FMRI studies of multisensory integration.

Authors:  Michael S Beauchamp
Journal:  Neuroinformatics       Date:  2005

5.  Neural mechanisms underlying auditory feedback control of speech.

Authors:  Jason A Tourville; Kevin J Reilly; Frank H Guenther
Journal:  Neuroimage       Date:  2007-10-11       Impact factor: 6.556

6.  Differential roles for left inferior frontal and superior temporal cortex in multimodal integration of action and language.

Authors:  Roel M Willems; Asli Ozyürek; Peter Hagoort
Journal:  Neuroimage       Date:  2009-06-01       Impact factor: 6.556

7.  Neural integration of iconic and unrelated coverbal gestures: a functional MRI study.

Authors:  Antonia Green; Benjamin Straube; Susanne Weis; Andreas Jansen; Klaus Willmes; Kerstin Konrad; Tilo Kircher
Journal:  Hum Brain Mapp       Date:  2009-10       Impact factor: 5.038

8.  Giving speech a hand: gesture modulates activity in auditory cortex during speech perception.

Authors:  Amy L Hubbard; Stephen M Wilson; Daniel E Callan; Mirella Dapretto
Journal:  Hum Brain Mapp       Date:  2009-03       Impact factor: 5.038

9.  Functional organization of inferior area 6 in the macaque monkey. II. Area F5 and the control of distal movements.

Authors:  G Rizzolatti; R Camarda; L Fogassi; M Gentilucci; G Luppino; M Matelli
Journal:  Exp Brain Res       Date:  1988       Impact factor: 1.972

10.  fMR-adaptation indicates selectivity to audiovisual content congruency in distributed clusters in human superior temporal cortex.

Authors:  Nienke M van Atteveldt; Vera C Blau; Leo Blomert; Rainer Goebel
Journal:  BMC Neurosci       Date:  2010-02-02       Impact factor: 3.288

Cited by: 4 in total

1.  Speech-accompanying gestures are not processed by the language-processing mechanisms.

Authors:  Olessia Jouravlev; David Zheng; Zuzanna Balewski; Alvince Le Arnz Pongos; Zena Levan; Susan Goldin-Meadow; Evelina Fedorenko
Journal:  Neuropsychologia       Date:  2019-07-02       Impact factor: 3.139

2.  Audio-visual and olfactory-visual integration in healthy participants and subjects with autism spectrum disorder.

Authors:  Susanne Stickel; Pauline Weismann; Thilo Kellermann; Christina Regenbogen; Ute Habel; Jessica Freiherr; Natalya Chechko
Journal:  Hum Brain Mapp       Date:  2019-07-13       Impact factor: 5.038

3.  Multimodal and Spectral Degradation Effects on Speech and Emotion Recognition in Adult Listeners.

Authors:  Chantel Ritter; Tara Vongpaisal
Journal:  Trends Hear       Date:  2018 Jan-Dec       Impact factor: 3.293

4.  Why We Should Study Multimodal Language.

Authors:  Pamela Perniss
Journal:  Front Psychol       Date:  2018-06-28
