Visual information constrains early and late stages of spoken-word recognition in sentence context.

Angèle Brunellière, Carolina Sánchez-García, Nara Ikumi, Salvador Soto-Faraco.

Abstract

Audiovisual speech perception has been frequently studied at the phoneme, syllable, and word processing levels. Here, we examined the constraints that visual speech information might exert during the recognition of words embedded in a natural sentence context. We recorded event-related potentials (ERPs) to words that were either strongly or weakly predictable from the prior sentential semantic context and whose initial phoneme varied in the degree of visual saliency of the lip movements. When the sentences were presented audiovisually (Experiment 1), words weakly predicted from semantic context elicited a larger, long-lasting N400 compared to strongly predictable words. This semantic effect interacted with the degree of visual saliency over a late part of the N400. When comparing audiovisual versus auditory-alone presentation (Experiment 2), the typical amplitude-reduction effect on the auditory-evoked N100 response was observed in the audiovisual modality. Interestingly, a specific benefit of high- versus low-visual-saliency constraints occurred over the early N100 response and at the late N400 time window, confirming the result of Experiment 1. Taken together, our results indicate that the saliency of visual speech can exert an influence over both auditory processing and word recognition at relatively late stages, and thus suggest strong interactivity between audiovisual integration and other (arguably higher) stages of information processing during natural speech comprehension.
Copyright © 2013 Elsevier B.V. All rights reserved.

Keywords:  Event-related potentials; Semantic constraints; Spoken-word recognition; Visual speech

Year:  2013        PMID: 23797145     DOI: 10.1016/j.ijpsycho.2013.06.016

Source DB:  PubMed          Journal:  Int J Psychophysiol        ISSN: 0167-8760            Impact factor:   2.997


Related articles (5 in total)

1.  More than words: word predictability, prosody, gesture and mouth movements in natural language comprehension.

Authors:  Ye Zhang; Diego Frassinelli; Jyrki Tuomainen; Jeremy I Skipper; Gabriella Vigliocco
Journal:  Proc Biol Sci       Date:  2021-07-21       Impact factor: 5.349

2.  Rhythm on Your Lips.

Authors:  Marcela Peña; Alan Langus; César Gutiérrez; Daniela Huepe-Artigas; Marina Nespor
Journal:  Front Psychol       Date:  2016-11-08

3.  Combined predictive effects of sentential and visual constraints in early audiovisual speech processing.

Authors:  Heidi Solberg Økland; Ana Todorović; Claudia S Lüttke; James M McQueen; Floris P de Lange
Journal:  Sci Rep       Date:  2019-05-27       Impact factor: 4.379

4.  Effect of attentional load on audiovisual speech perception: evidence from ERPs.

Authors:  Agnès Alsius; Riikka Möttönen; Mikko E Sams; Salvador Soto-Faraco; Kaisa Tiippana
Journal:  Front Psychol       Date:  2014-07-15

5.  Electrophysiological Dynamics of Visual Speech Processing and the Role of Orofacial Effectors for Cross-Modal Predictions.

Authors:  Maëva Michon; Gonzalo Boncompte; Vladimir López
Journal:  Front Hum Neurosci       Date:  2020-10-27       Impact factor: 3.169
