
Engaging the Articulators Enhances Perception of Concordant Visible Speech Movements.

Matthew Masapollo, Frank H Guenther

Abstract

Purpose: This study aimed to test whether (and how) somatosensory feedback signals from the vocal tract affect concurrent unimodal visual speech perception.

Method: Participants discriminated pairs of silent visual utterances of vowels under 3 experimental conditions: (a) normal (baseline) and while holding either (b) a bite block or (c) a lip tube in their mouths. To test the specificity of somatosensory-visual interactions during perception, we assessed discrimination of vowel contrasts optically distinguished based on their mandibular (English /ɛ/-/æ/) or labial (English /u/-French /u/) postures. In addition, we assessed perception of each contrast using dynamically articulating videos and static (single-frame) images of each gesture (at vowel midpoint).

Results: Engaging the jaw selectively facilitated perception of the dynamic gestures optically distinct in terms of jaw height, whereas engaging the lips selectively facilitated perception of the dynamic gestures optically distinct in terms of their degree of lip compression and protrusion. Thus, participants perceived visible speech movements in relation to the configuration and shape of their own vocal tract (and possibly their ability to produce covert vowel production-like movements). In contrast, engaging the articulators had no effect when the speaking faces did not move, suggesting that the somatosensory inputs affected perception of time-varying kinematic information rather than changes in target (movement end point) mouth shapes.

Conclusions: These findings suggest that orofacial somatosensory inputs associated with speech production prime premotor and somatosensory brain regions involved in the sensorimotor control of speech, thereby facilitating perception of concordant visible speech movements.

Supplemental Material: https://doi.org/10.23641/asha.9911846


Year:  2019        PMID: 31577522     DOI: 10.1044/2019_JSLHR-S-19-0167

Source DB:  PubMed          Journal:  J Speech Lang Hear Res        ISSN: 1092-4388            Impact factor:   2.297


Related articles (3 in total)

1.  When Additional Training Isn't Enough: Further Evidence That Unpredictable Speech Inhibits Adaptation.

Authors:  Kaitlin L Lansford; Stephanie A Borrie; Tyson S Barrett; Cassidy Flechaus
Journal:  J Speech Lang Hear Res       Date:  2020-05-20       Impact factor: 2.297

2.  Neural indicators of articulator-specific sensorimotor influences on infant speech perception.

Authors:  Dawoon Choi; Ghislaine Dehaene-Lambertz; Marcela Peña; Janet F Werker
Journal:  Proc Natl Acad Sci U S A       Date:  2021-05-18       Impact factor: 11.205

3.  Neurophysiological Correlates of Asymmetries in Vowel Perception: An English-French Cross-Linguistic Event-Related Potential Study.

Authors:  Linda Polka; Monika Molnar; T Christina Zhao; Matthew Masapollo
Journal:  Front Hum Neurosci       Date:  2021-06-03       Impact factor: 3.473

