
Congruent Visual Speech Enhances Cortical Entrainment to Continuous Auditory Speech in Noise-Free Conditions.

Michael J Crosse, John S Butler, Edmund C Lalor

Abstract

Congruent audiovisual speech enhances our ability to comprehend a speaker, even in noise-free conditions. When incongruent auditory and visual information is presented concurrently, it can hinder a listener's perception and even cause him or her to perceive information that was not presented in either modality. Efforts to investigate the neural basis of these effects have often focused on the special case of discrete audiovisual syllables that are spatially and temporally congruent, with less work done on the case of natural, continuous speech. Recent electrophysiological studies have demonstrated that cortical response measures to continuous auditory speech can be easily obtained using multivariate analysis methods. Here, we apply such methods to the case of audiovisual speech and, importantly, present a novel framework for indexing multisensory integration in the context of continuous speech. Specifically, we examine how the temporal and contextual congruency of ongoing audiovisual speech affects the cortical encoding of the speech envelope in humans using electroencephalography. We demonstrate that the cortical representation of the speech envelope is enhanced by the presentation of congruent audiovisual speech in noise-free conditions. Furthermore, we show that this is likely attributable to the contribution of neural generators that are not particularly active during unimodal stimulation and that it is most prominent at the temporal scale corresponding to syllabic rate (2-6 Hz). Finally, our data suggest that neural entrainment to the speech envelope is inhibited when the auditory and visual streams are incongruent both temporally and contextually.

SIGNIFICANCE STATEMENT: Seeing a speaker's face as he or she talks can greatly help in understanding what the speaker is saying. This is because the speaker's facial movements relay information about what the speaker is saying, but also, importantly, when the speaker is saying it. Studying how the brain uses this timing relationship to combine information from continuous auditory and visual speech has traditionally been methodologically difficult. Here we introduce a new approach for doing this using relatively inexpensive and noninvasive scalp recordings. Specifically, we show that the brain's representation of auditory speech is enhanced when the accompanying visual speech signal shares the same timing. Furthermore, we show that this enhancement is most pronounced at a time scale that corresponds to mean syllable length.
Copyright © 2015 the authors 0270-6474/15/3514195-10$15.00/0.
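The abstract notes that cortical tracking of continuous speech can be indexed with multivariate stimulus-reconstruction methods. As a rough illustration of that general idea (not the authors' actual pipeline), the sketch below fits a backward ridge-regression decoder that reconstructs a speech envelope from time-lagged multichannel "EEG" on synthetic data; the sampling rate, lag range, channel count, and regularization strength are all illustrative choices, not parameters from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 64              # sampling rate in Hz (illustrative, not the paper's)
n = fs * 60          # 60 s of data
n_ch = 8             # toy channel count
lags = range(16)     # decoder lags 0-250 ms at 64 Hz (illustrative)

# Toy "speech envelope": smoothed noise, loosely mimicking slow (few-Hz) dynamics
env = np.convolve(rng.standard_normal(n), np.ones(16) / 16, mode="same")
env = env / env.std()

# Simulated EEG: each channel is a randomly delayed copy of the envelope plus noise
delays = rng.integers(0, 16, n_ch)
eeg = np.stack([np.roll(env, d) + 0.5 * rng.standard_normal(n) for d in delays],
               axis=1)                       # shape (n, n_ch)

def lag_matrix(x, lags):
    """Stack time-lagged copies of every channel: one column per channel-lag pair."""
    return np.column_stack([np.roll(x, -l, axis=0) for l in lags])

X = lag_matrix(eeg, lags)                    # shape (n, n_ch * len(lags))

# Backward model (decoder) via ridge regression: w = (X'X + lam*I)^-1 X' env
lam = 1e2
w = np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ env)
recon = X @ w

# Reconstruction accuracy = Pearson correlation between actual and decoded envelope
r = np.corrcoef(env, recon)[0, 1]
print(f"reconstruction accuracy r = {r:.2f}")
```

In studies of this kind, decoders trained per condition (e.g., audiovisual vs. audio-only) are compared by such correlation scores; in practice analyses like this are typically run with dedicated tools such as the authors' mTRF-Toolbox rather than hand-rolled code.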

Keywords:  EEG; audiovisual speech; multisensory integration; stimulus reconstruction; temporal coherence; temporal response function

Year:  2015        PMID: 26490860      PMCID: PMC6605423          DOI: 10.1523/JNEUROSCI.1829-15.2015

Source DB:  PubMed          Journal:  J Neurosci        ISSN: 0270-6474            Impact factor:   6.167


Cited references: 59 in total (first 10 shown)

1.  Processing of changes in visual speech in the human auditory cortex.

Authors:  Riikka Möttönen; Christina M Krause; Kaisa Tiippana; Mikko Sams
Journal:  Brain Res Cogn Brain Res       Date:  2002-05

2.  Speech comprehension is correlated with temporal response patterns recorded from auditory cortex.

Authors:  E Ahissar; S Nagarajan; M Ahissar; A Protopapas; H Mahncke; M M Merzenich
Journal:  Proc Natl Acad Sci U S A       Date:  2001-11-06       Impact factor: 11.205

3.  The use of visible speech cues for improving auditory detection of spoken sentences.

Authors:  K W Grant; P F Seitz
Journal:  J Acoust Soc Am       Date:  2000-09       Impact factor: 1.840

4.  Electrophysiological indicators of phonetic and non-phonetic multisensory interactions during audiovisual speech perception.

Authors:  Vasily Klucharev; Riikka Möttönen; Mikko Sams
Journal:  Brain Res Cogn Brain Res       Date:  2003-12

Review 5.  Temporal information in speech: acoustic, auditory and linguistic aspects.

Authors:  S Rosen
Journal:  Philos Trans R Soc Lond B Biol Sci       Date:  1992-06-29       Impact factor: 6.237

Review 6.  Lipreading and audio-visual speech perception.

Authors:  Q Summerfield
Journal:  Philos Trans R Soc Lond B Biol Sci       Date:  1992-01-29       Impact factor: 6.237

7.  Integration of auditory and visual information about objects in superior temporal sulcus.

Authors:  Michael S Beauchamp; Kathryn E Lee; Brenna D Argall; Alex Martin
Journal:  Neuron       Date:  2004-03-04       Impact factor: 17.173

8.  Hearing lips and seeing voices.

Authors:  H McGurk; J MacDonald
Journal:  Nature       Date:  1976-12-23       Impact factor: 49.962

9.  Statistical facilitation of simple reaction times.

Authors:  D H Raab
Journal:  Trans N Y Acad Sci       Date:  1962-03

10.  Reading speech from still and moving faces: the neural substrates of visible speech.

Authors:  Gemma A Calvert; Ruth Campbell
Journal:  J Cogn Neurosci       Date:  2003-01-01       Impact factor: 3.225

Citing articles: 42 in total (first 10 shown)

1.  Electrocorticography reveals continuous auditory and visual speech tracking in temporal and occipital cortex.

Authors:  Cristiano Micheli; Inga M Schepers; Müge Ozker; Daniel Yoshor; Michael S Beauchamp; Jochem W Rieger
Journal:  Eur J Neurosci       Date:  2018-08-12       Impact factor: 3.386

Review 2.  Machine Learning Approaches to Analyze Speech-Evoked Neurophysiological Responses.

Authors:  Zilong Xie; Rachel Reetzke; Bharath Chandrasekaran
Journal:  J Speech Lang Hear Res       Date:  2019-03-25       Impact factor: 2.297

3.  Vision perceptually restores auditory spectral dynamics in speech.

Authors:  John Plass; David Brang; Satoru Suzuki; Marcia Grabowecky
Journal:  Proc Natl Acad Sci U S A       Date:  2020-07-06       Impact factor: 11.205

4.  Eye Can Hear Clearly Now: Inverse Effectiveness in Natural Audiovisual Speech Processing Relies on Long-Term Crossmodal Temporal Integration.

Authors:  Michael J Crosse; Giovanni M Di Liberto; Edmund C Lalor
Journal:  J Neurosci       Date:  2016-09-21       Impact factor: 6.167

5.  Anomalous network architecture of the resting brain in children who stutter.

Authors:  Soo-Eun Chang; Michael Angstadt; Ho Ming Chow; Andrew C Etchell; Emily O Garnett; Ai Leen Choo; Daniel Kessler; Robert C Welsh; Chandra Sripada
Journal:  J Fluency Disord       Date:  2017-01-25       Impact factor: 2.538

6.  Visual cortex entrains to sign language.

Authors:  Geoffrey Brookshire; Jenny Lu; Howard C Nusbaum; Susan Goldin-Meadow; Daniel Casasanto
Journal:  Proc Natl Acad Sci U S A       Date:  2017-05-30       Impact factor: 11.205

7.  Envelope reconstruction of speech and music highlights stronger tracking of speech at low frequencies.

Authors:  Nathaniel J Zuk; Jeremy W Murphy; Richard B Reilly; Edmund C Lalor
Journal:  PLoS Comput Biol       Date:  2021-09-17       Impact factor: 4.475

Review 8.  IFCN-endorsed practical guidelines for clinical magnetoencephalography (MEG).

Authors:  Riitta Hari; Sylvain Baillet; Gareth Barnes; Richard Burgess; Nina Forss; Joachim Gross; Matti Hämäläinen; Ole Jensen; Ryusuke Kakigi; François Mauguière; Nobukatzu Nakasato; Aina Puce; Gian-Luca Romani; Alfons Schnitzler; Samu Taulu
Journal:  Clin Neurophysiol       Date:  2018-04-17       Impact factor: 3.708

9.  Crossmodal Phase Reset and Evoked Responses Provide Complementary Mechanisms for the Influence of Visual Speech in Auditory Cortex.

Authors:  Pierre Mégevand; Manuel R Mercier; David M Groppe; Elana Zion Golumbic; Nima Mesgarani; Michael S Beauchamp; Charles E Schroeder; Ashesh D Mehta
Journal:  J Neurosci       Date:  2020-10-06       Impact factor: 6.167

10.  Using Coherence-based spectro-spatial filters for stimulus features prediction from electro-corticographic recordings.

Authors:  Jaime Delgado Saa; Andy Christen; Stephanie Martin; Brian N Pasley; Robert T Knight; Anne-Lise Giraud
Journal:  Sci Rep       Date:  2020-05-06       Impact factor: 4.379

