
Physical and perceptual factors shape the neural mechanisms that integrate audiovisual signals in speech comprehension.

Hwee Ling Lee, Uta Noppeney.

Abstract

Face-to-face communication challenges the human brain to integrate information from the auditory and visual senses with linguistic representations. Yet the roles of bottom-up physical input (spectrotemporal structure) and top-down linguistic constraints in shaping the neural mechanisms specialized for integrating audiovisual speech signals are currently unknown. Participants were presented with speech and sinewave speech analogs in visual, auditory, and audiovisual modalities. Before the fMRI study, they were trained to perceive physically identical sinewave speech analogs as speech (SWS-S) or nonspeech (SWS-N). Comparing audiovisual integration (interactions) of speech, SWS-S, and SWS-N revealed a posterior-anterior processing gradient within the left superior temporal sulcus/gyrus (STS/STG): Bilateral posterior STS/STG integrated audiovisual inputs regardless of spectrotemporal structure or speech percept; in left mid-STS, the integration profile was primarily determined by the spectrotemporal structure of the signals; more anterior STS regions discarded spectrotemporal structure and integrated audiovisual signals constrained by stimulus intelligibility and the availability of linguistic representations. In addition to this "ventral" processing stream, a "dorsal" circuitry encompassing posterior STS/STG and left inferior frontal gyrus differentially integrated audiovisual speech and SWS signals. Indeed, dynamic causal modeling and Bayesian model comparison provided strong evidence for a parallel processing structure encompassing a ventral and a dorsal stream, with speech intelligibility training enhancing the connectivity between posterior and anterior STS/STG. In conclusion, audiovisual speech comprehension emerges in an interactive process, with the integration of auditory and visual signals being progressively constrained by stimulus intelligibility along the STS and by spectrotemporal structure in a dorsal fronto-temporal circuitry.
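The Bayesian model comparison reported in the abstract weighs candidate dynamic causal models (e.g., serial versus parallel ventral/dorsal architectures) by their log model evidence. A minimal sketch of the standard arithmetic, assuming a uniform prior over models and purely hypothetical log-evidence values (not the study's data):

```python
import math

def posterior_model_probs(log_evidences):
    """Turn log model evidences into posterior model probabilities
    under a uniform model prior, via a numerically stable softmax."""
    m = max(log_evidences)
    weights = [math.exp(le - m) for le in log_evidences]
    total = sum(weights)
    return [w / total for w in weights]

# Hypothetical log-evidences for three candidate architectures:
# [serial ventral-only, serial dorsal-only, parallel ventral + dorsal]
log_ev = [-1230.0, -1228.0, -1215.0]
probs = posterior_model_probs(log_ev)
# A log-evidence difference > 3 (Bayes factor > ~20) is conventionally
# taken as "strong" evidence for the winning model.
```

With these illustrative numbers the parallel model receives essentially all of the posterior probability, which is the sense in which the abstract's evidence is called "strong".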

Year: 2011    PMID: 21813693    PMCID: PMC6623363    DOI: 10.1523/JNEUROSCI.6510-10.2011

Source DB: PubMed    Journal: J Neurosci    ISSN: 0270-6474    Impact factor: 6.167


Related articles (24 in total)

1.  Electrocorticography reveals continuous auditory and visual speech tracking in temporal and occipital cortex.

Authors:  Cristiano Micheli; Inga M Schepers; Müge Ozker; Daniel Yoshor; Michael S Beauchamp; Jochem W Rieger
Journal:  Eur J Neurosci       Date:  2018-08-12       Impact factor: 3.386

2.  Sensory and striatal areas integrate auditory and visual signals into behavioral benefits during motion discrimination.

Authors:  Sebastian von Saldern; Uta Noppeney
Journal:  J Neurosci       Date:  2013-05-15       Impact factor: 6.167

3.  Adaptive benefit of cross-modal plasticity following cochlear implantation in deaf adults.

Authors:  Carly A Anderson; Ian M Wiggins; Pádraig T Kitterick; Douglas E H Hartley
Journal:  Proc Natl Acad Sci U S A       Date:  2017-08-14       Impact factor: 11.205

4.  Causal Inference in Audiovisual Perception.

Authors:  Agoston Mihalik; Uta Noppeney
Journal:  J Neurosci       Date:  2020-07-15       Impact factor: 6.167

5.  Electrocorticography Reveals Enhanced Visual Cortex Responses to Visual Speech.

Authors:  Inga M Schepers; Daniel Yoshor; Michael S Beauchamp
Journal:  Cereb Cortex       Date:  2014-06-05       Impact factor: 5.357

6.  A Double Dissociation between Anterior and Posterior Superior Temporal Gyrus for Processing Audiovisual Speech Demonstrated by Electrocorticography.

Authors:  Muge Ozker; Inga M Schepers; John F Magnotti; Daniel Yoshor; Michael S Beauchamp
Journal:  J Cogn Neurosci       Date:  2017-03-02       Impact factor: 3.225

7.  Prefrontal neuronal responses during audiovisual mnemonic processing.

Authors:  Jaewon Hwang; Lizabeth M Romanski
Journal:  J Neurosci       Date:  2015-01-21       Impact factor: 6.167

8.  Shared and modality-specific brain regions that mediate auditory and visual word comprehension.

Authors:  Anne Keitel; Joachim Gross; Christoph Kayser
Journal:  Elife       Date:  2020-08-24       Impact factor: 8.140

9.  Gated audiovisual speech identification in silence vs. noise: effects on time and accuracy.

Authors:  Shahram Moradi; Björn Lidestam; Jerker Rönnberg
Journal:  Front Psychol       Date:  2013-06-19

10.  New levels of language processing complexity and organization revealed by granger causation.

Authors:  David W Gow; David N Caplan
Journal:  Front Psychol       Date:  2012-11-19
