
Synchronization of speech and gesture: evidence for interaction in action.

Mingyuan Chu, Peter Hagoort.

Abstract

Language and action systems are highly interlinked. A critical piece of evidence is that speech and its accompanying gestures are tightly synchronized. Five experiments were conducted to test 2 hypotheses about the synchronization of speech and gesture. According to the interactive view, there is continuous information exchange between the gesture and speech systems, during both their planning and execution phases. According to the ballistic view, information exchange occurs only during the planning phases of gesture and speech, but the 2 systems become independent once their execution has been initiated. In all experiments, participants were required to point to and/or name a light that had just lit up. Virtual reality and motion tracking technologies were used to disrupt their gesture or speech execution. Participants delayed their speech onset when their gesture was disrupted. They did so even when their gesture was disrupted at its late phase and even when they received only the kinesthetic feedback of their gesture. Also, participants prolonged their gestures when their speech was disrupted. These findings support the interactive view and add new constraints on models of speech and gesture production. PsycINFO Database Record (c) 2014 APA, all rights reserved.


Year:  2014        PMID: 24635187     DOI: 10.1037/a0036281

Source DB:  PubMed          Journal:  J Exp Psychol Gen        ISSN: 0096-3445


Citing articles:  7 in total

1.  Virtual reality: A game-changing method for the language sciences. (Review)

Authors:  David Peeters
Journal:  Psychon Bull Rev       Date:  2019-06

2.  Entrainment and Modulation of Gesture-Speech Synchrony Under Delayed Auditory Feedback.

Authors:  Wim Pouw; James A Dixon
Journal:  Cogn Sci       Date:  2019-03

3.  Effects of Scale on Multimodal Deixis: Evidence From Quiahije Chatino.

Authors:  Kate Mesh; Emiliana Cruz; Joost van de Weijer; Niclas Burenhult; Marianne Gullberg
Journal:  Front Psychol       Date:  2021-01-12

4.  The importance of visual control and biomechanics in the regulation of gesture-speech synchrony for an individual deprived of proprioceptive feedback of body position.

Authors:  Wim Pouw; Steven J Harrison; James A Dixon
Journal:  Sci Rep       Date:  2022-08-30       Impact factor: 4.996

5.  Hierarchical Integration of Communicative and Spatial Perspective-Taking Demands in Sensorimotor Control of Referential Pointing.

Authors:  Rui Liu; Sara Bögels; Geoffrey Bird; W Pieter Medendorp; Ivan Toni
Journal:  Cogn Sci       Date:  2022-01

6.  Acoustic information about upper limb movement in voicing.

Authors:  Wim Pouw; Alexandra Paxton; Steven J Harrison; James A Dixon
Journal:  Proc Natl Acad Sci U S A       Date:  2020-05-11       Impact factor: 11.205

7.  The quantification of gesture-speech synchrony: A tutorial and validation of multimodal data acquisition using device-based and video-based motion tracking.

Authors:  Wim Pouw; James P Trujillo; James A Dixon
Journal:  Behav Res Methods       Date:  2020-04
