
What iconic gesture fragments reveal about gesture-speech integration: when synchrony is lost, memory can help.

Christian Obermeier, Henning Holle, Thomas C Gunter

Abstract

The present series of experiments explores several issues related to gesture-speech integration and synchrony during sentence processing. To manipulate gesture-speech synchrony more precisely, we used gesture fragments instead of complete gestures, thereby avoiding the long temporal overlap that gestures usually have with their coexpressive speech. A pretest therefore identified the minimal duration of an iconic gesture fragment needed to disambiguate a homonym (i.e., its disambiguation point). In three subsequent ERP experiments, we then investigated whether the gesture information available at the disambiguation point has immediate as well as delayed consequences for the processing of a temporarily ambiguous spoken sentence, and whether these gesture-speech integration processes are susceptible to temporal synchrony. Experiment 1, which used asynchronous stimuli and an explicit task, showed clear N400 effects both at the homonym and at a target word presented further downstream, suggesting that asynchrony does not prevent integration under explicit task conditions. No such effects were found when asynchronous stimuli were presented with a shallower task (Experiment 2). Finally, when gesture fragment and homonym were synchronous, results similar to those of Experiment 1 emerged even under shallow task conditions (Experiment 3). We conclude that when iconic gesture fragments and speech are in synchrony, their interaction is more or less automatic. When they are not, more controlled, active memory processes are necessary to combine the gesture fragment and speech context in such a way that the homonym is disambiguated correctly.


Year:  2010        PMID: 20350188     DOI: 10.1162/jocn.2010.21498

Source DB:  PubMed          Journal:  J Cogn Neurosci        ISSN: 0898-929X            Impact factor:   3.225


Related articles (12 in total)

1.  Beyond words: evidence for automatic language-gesture integration of symbolic gestures but not dynamic landscapes.

Authors:  Dana Vainiger; Ludovica Labruna; Richard B Ivry; Michal Lavidor
Journal:  Psychol Res       Date:  2013-01-10

2.  [Review] Hearing and seeing meaning in speech and gesture: insights from brain and behaviour.

Authors:  Aslı Özyürek
Journal:  Philos Trans R Soc Lond B Biol Sci       Date:  2014-09-19       Impact factor: 6.237

3.  A speaker's gesture style can affect language comprehension: ERP evidence from gesture-speech integration.

Authors:  Christian Obermeier; Spencer D Kelly; Thomas C Gunter
Journal:  Soc Cogn Affect Neurosci       Date:  2015-02-16       Impact factor: 3.436

4.  Gesture-Speech Integration in Typical and Atypical Adolescent Readers.

Authors:  Ru Yao; Connie Qun Guan; Elaine R Smolen; Brian MacWhinney; Wanjin Meng; Laura M Morett
Journal:  Front Psychol       Date:  2022-06-03

5.  TMS Reveals Dynamic Interaction between Inferior Frontal Gyrus and Posterior Middle Temporal Gyrus in Gesture-Speech Semantic Integration.

Authors:  Wanying Zhao; Yanchang Li; Yi Du
Journal:  J Neurosci       Date:  2021-11-16       Impact factor: 6.709

6.  Aging and working memory modulate the ability to benefit from visible speech and iconic gestures during speech-in-noise comprehension.

Authors:  Louise Schubotz; Judith Holler; Linda Drijvers; Aslı Özyürek
Journal:  Psychol Res       Date:  2020-07-05

7.  Gesture facilitates the syntactic analysis of speech.

Authors:  Henning Holle; Christian Obermeier; Maren Schmidt-Kassow; Angela D Friederici; Jamie Ward; Thomas C Gunter
Journal:  Front Psychol       Date:  2012-03-19

8.  N400 amplitude, latency, and variability reflect temporal integration of beat gesture and pitch accent during language processing.

Authors:  Laura M Morett; Nicole Landi; Julia Irwin; James C McPartland
Journal:  Brain Res       Date:  2020-08-17       Impact factor: 3.610

9.  Inconsistent use of gesture space during abstract pointing impairs language comprehension.

Authors:  Thomas C Gunter; J E Douglas Weinbrenner; Henning Holle
Journal:  Front Psychol       Date:  2015-02-09

10.  A supramodal neural network for speech and gesture semantics: an fMRI study.

Authors:  Benjamin Straube; Antonia Green; Susanne Weis; Tilo Kircher
Journal:  PLoS One       Date:  2012-11-30       Impact factor: 3.240

