
Dual neural routing of visual facilitation in speech processing.

Luc H Arnal, Benjamin Morillon, Christian A Kell, Anne-Lise Giraud.

Abstract

Viewing our interlocutor facilitates speech perception, unlike, for instance, when we speak on the telephone. Several neural routes and mechanisms could account for this phenomenon. Using magnetoencephalography, we show that when the interlocutor is visible, the latencies of auditory responses (M100) shorten as speech becomes more predictable from visual input, whether or not the auditory signal is congruent with it. Incongruence of auditory and visual input affected auditory responses approximately 20 ms after this latency shortening was detected, indicating that an initial content-dependent auditory facilitation by vision is followed by a feedback signal reflecting the error between expected and received auditory input (prediction error). We then used functional magnetic resonance imaging and confirmed that distinct routes of visual information to auditory processing underlie these two functional mechanisms. Functional connectivity between visual motion and auditory areas depended on the degree of visual predictability, whereas connectivity between the superior temporal sulcus and both auditory and visual motion areas was driven by audiovisual (AV) incongruence. These results establish two distinct mechanisms by which the brain uses potentially predictive visual information to improve auditory perception: a fast, direct corticocortical pathway that conveys visual motion parameters to auditory cortex, and a slower, indirect feedback pathway that signals the error between the visual prediction and the auditory input.


Year:  2009        PMID: 19864557      PMCID: PMC6665008          DOI: 10.1523/JNEUROSCI.3194-09.2009

Source DB:  PubMed          Journal:  J Neurosci        ISSN: 0270-6474            Impact factor:   6.167


Related articles: 78 in total

1.  Neurophysiological origin of human brain asymmetry for speech and language.

Authors:  Benjamin Morillon; Katia Lehongre; Richard S J Frackowiak; Antoine Ducorps; Andreas Kleinschmidt; David Poeppel; Anne-Lise Giraud
Journal:  Proc Natl Acad Sci U S A       Date:  2010-10-18       Impact factor: 11.205

2.  Multistage audiovisual integration of speech: dissociating identification and detection.

Authors:  Kasper Eskelund; Jyrki Tuomainen; Tobias S Andersen
Journal:  Exp Brain Res       Date:  2010-12-25       Impact factor: 1.972

3.  Dynamic faces speed up the onset of auditory cortical spiking responses during vocal detection.

Authors:  Chandramouli Chandrasekaran; Luis Lemus; Asif A Ghazanfar
Journal:  Proc Natl Acad Sci U S A       Date:  2013-11-11       Impact factor: 11.205

4.  [Review] Facial expressions and the evolution of the speech rhythm.

Authors:  Asif A Ghazanfar; Daniel Y Takahashi
Journal:  J Cogn Neurosci       Date:  2014-01-23       Impact factor: 3.225

5.  Transitions in neural oscillations reflect prediction errors generated in audiovisual speech.

Authors:  Luc H Arnal; Valentin Wyart; Anne-Lise Giraud
Journal:  Nat Neurosci       Date:  2011-05-08       Impact factor: 24.884

6.  Early and late beta-band power reflect audiovisual perception in the McGurk illusion.

Authors:  Yadira Roa Romero; Daniel Senkowski; Julian Keil
Journal:  J Neurophysiol       Date:  2015-01-07       Impact factor: 2.714

7.  Left dorsal speech stream components and their contribution to phonological processing.

Authors:  Takenobu Murakami; Christian A Kell; Julia Restle; Yoshikazu Ugawa; Ulf Ziemann
Journal:  J Neurosci       Date:  2015-01-28       Impact factor: 6.167

8.  Cross-modal prediction in speech depends on prior linguistic experience.

Authors:  Carolina Sánchez-García; James T Enns; Salvador Soto-Faraco
Journal:  Exp Brain Res       Date:  2013-02-06       Impact factor: 1.972

9.  Auditory, Visual and Audiovisual Speech Processing Streams in Superior Temporal Sulcus.

Authors:  Jonathan H Venezia; Kenneth I Vaden; Feng Rong; Dale Maddox; Kourosh Saberi; Gregory Hickok
Journal:  Front Hum Neurosci       Date:  2017-04-07       Impact factor: 3.169

10.  Auditory cortex tracks both auditory and visual stimulus dynamics using low-frequency neuronal phase modulation.

Authors:  Huan Luo; Zuxiang Liu; David Poeppel
Journal:  PLoS Biol       Date:  2010-08-10       Impact factor: 8.029


Beijing Coyote Bioscience Co., Ltd. (北京卡尤迪生物科技股份有限公司) © 2022-2023.