
Articulatory constraints on spontaneous entrainment between speech and manual gesture.

Gregory Zelic, Jeesun Kim, Chris Davis.

Abstract

The present study examined the extent to which speech and manual gestures spontaneously entrain in a non-communicative task. Participants repeatedly uttered nonsense /CV/ syllables while continuously moving the right index finger in flexion/extension; no instructions to coordinate were given. We manipulated the type of syllable uttered (/ba/ vs. /sa/) and the vocalization mode (phonated vs. silent speech). Based on principles of coordination dynamics, we predicted stronger entrainment between finger oscillations and jaw motion (1) for /ba/, due to the expected larger amplitude of jaw motion, and (2) in phonated speech, due to the auditory feedback. Fifteen of the twenty participants showed simple ratios of speech to finger cycles (1:1, 1:2, or 2:1). Contrary to our predictions, speech-gesture entrainment was stronger when vocalizing /sa/ than /ba/, and was more widely distributed around an in-phase mode. Furthermore, the results revealed spatial anchoring and increased temporal variability in jaw motion when producing /sa/. We suggest this indicates greater control of the speech articulators for /sa/, making the speech performance more receptive to environmental forces and thereby producing the greater entrainment observed with the gesture oscillations. The speech-gesture coordination was maintained in silent speech, suggesting a somatosensory basis for their endogenous coupling.
Copyright © 2015 Elsevier B.V. All rights reserved.


Keywords:  Rhythmic coordination dynamics; Speech articulatory constraints; Spontaneous entrainment processes


Year:  2015        PMID: 26072361     DOI: 10.1016/j.humov.2015.05.009

Source DB:  PubMed          Journal:  Hum Mov Sci        ISSN: 0167-9457            Impact factor:   2.161


Related articles:  3 in total

1.  Perceptuo-motor compatibility governs multisensory integration in bimanual coordination dynamics.

Authors:  Gregory Zelic; Denis Mottet; Julien Lagarde
Journal:  Exp Brain Res       Date:  2015-11-02       Impact factor: 1.972

2.  Entrainment and Modulation of Gesture-Speech Synchrony Under Delayed Auditory Feedback.

Authors:  Wim Pouw; James A Dixon
Journal:  Cogn Sci       Date:  2019-03

3.  The quantification of gesture-speech synchrony: A tutorial and validation of multimodal data acquisition using device-based and video-based motion tracking.

Authors:  Wim Pouw; James P Trujillo; James A Dixon
Journal:  Behav Res Methods       Date:  2020-04
