Crossmodal Phase Reset and Evoked Responses Provide Complementary Mechanisms for the Influence of Visual Speech in Auditory Cortex.

Pierre Mégevand, Manuel R Mercier, David M Groppe, Elana Zion Golumbic, Nima Mesgarani, Michael S Beauchamp, Charles E Schroeder, Ashesh D Mehta.

Abstract

Natural conversation is multisensory: when we can see the speaker's face, visual speech cues improve our comprehension. The neuronal mechanisms underlying this phenomenon remain unclear. The two main alternatives are visually mediated phase modulation of neuronal oscillations (excitability fluctuations) in auditory neurons and visual input-evoked responses in auditory neurons. Investigating this question using naturalistic audiovisual speech with intracranial recordings in humans of both sexes, we find evidence for both mechanisms. Remarkably, auditory cortical neurons track the temporal dynamics of purely visual speech using the phase of their slow oscillations and phase-related modulations in broadband high-frequency activity. Consistent with known perceptual enhancement effects, the visual phase reset amplifies the cortical representation of concomitant auditory speech. In contrast, and in line with earlier reports, visual input reduces the amplitude of evoked responses to concomitant auditory input. We interpret the combination of improved phase tracking and reduced response amplitude as evidence for more efficient and reliable stimulus processing in the presence of congruent auditory and visual speech inputs.

Significance Statement:  Watching the speaker can facilitate our understanding of what is being said. The mechanisms responsible for this influence of visual cues on the processing of speech remain incompletely understood. We studied these mechanisms by recording the electrical activity of the human brain through electrodes implanted surgically inside the brain. We found that visual inputs can operate by directly activating auditory cortical areas, and also indirectly by modulating the strength of cortical responses to auditory input. Our results help to understand the mechanisms by which the brain merges auditory and visual speech into a unitary perception.
Copyright © 2020 the authors.

Keywords:  audiovisual speech; broadband high-frequency activity; crossmodal stimuli; intracranial electroencephalography; neuronal oscillations; phase–amplitude coupling

Year:  2020        PMID: 33023923      PMCID: PMC7605423          DOI: 10.1523/JNEUROSCI.0555-20.2020

Source DB:  PubMed          Journal:  J Neurosci        ISSN: 0270-6474            Impact factor:   6.167


References:  85 in total

Review 1.  Neuronal oscillations and visual amplification of speech.

Authors:  Charles E Schroeder; Peter Lakatos; Yoshinao Kajikawa; Sarah Partan; Aina Puce
Journal:  Trends Cogn Sci       Date:  2008-02-15       Impact factor: 20.229

2.  Mass univariate analysis of event-related brain potentials/fields II: Simulation studies.

Authors:  David M Groppe; Thomas P Urbach; Marta Kutas
Journal:  Psychophysiology       Date:  2011-09-06       Impact factor: 4.016

3.  Eye Can Hear Clearly Now: Inverse Effectiveness in Natural Audiovisual Speech Processing Relies on Long-Term Crossmodal Temporal Integration.

Authors:  Michael J Crosse; Giovanni M Di Liberto; Edmund C Lalor
Journal:  J Neurosci       Date:  2016-09-21       Impact factor: 6.167

4.  Individualized localization and cortical surface-based registration of intracranial electrodes.

Authors:  Andrew R Dykstra; Alexander M Chan; Brian T Quinn; Rodrigo Zepeda; Corey J Keller; Justine Cormier; Joseph R Madsen; Emad N Eskandar; Sydney S Cash
Journal:  Neuroimage       Date:  2011-11-28       Impact factor: 6.556

5.  Mechanisms underlying selective neuronal tracking of attended speech at a "cocktail party".

Authors:  Elana M Zion Golumbic; Nai Ding; Stephan Bickel; Peter Lakatos; Catherine A Schevon; Guy M McKhann; Robert R Goodman; Ronald Emerson; Ashesh D Mehta; Jonathan Z Simon; David Poeppel; Charles E Schroeder
Journal:  Neuron       Date:  2013-03-06       Impact factor: 17.173

6.  Functional mapping of human sensorimotor cortex with electrocorticographic spectral analysis. II. Event-related synchronization in the gamma band.

Authors:  N E Crone; D L Miglioretti; B Gordon; R P Lesser
Journal:  Brain       Date:  1998-12       Impact factor: 13.501

7.  The leading sense: supramodal control of neurophysiological context by attention.

Authors:  Peter Lakatos; Monica N O'Connell; Annamaria Barczak; Aimee Mills; Daniel C Javitt; Charles E Schroeder
Journal:  Neuron       Date:  2009-11-12       Impact factor: 17.173

8.  Perceptually relevant speech tracking in auditory and motor cortex reflects distinct linguistic features.

Authors:  Anne Keitel; Joachim Gross; Christoph Kayser
Journal:  PLoS Biol       Date:  2018-03-12       Impact factor: 8.029

9.  Left Superior Temporal Gyrus Is Coupled to Attended Speech in a Cocktail-Party Auditory Scene.

Authors:  Marc Vander Ghinst; Mathieu Bourguignon; Marc Op de Beeck; Vincent Wens; Brice Marty; Sergio Hassid; Georges Choufani; Veikko Jousmäki; Riitta Hari; Patrick Van Bogaert; Serge Goldman; Xavier De Tiège
Journal:  J Neurosci       Date:  2016-02-03       Impact factor: 6.167

10.  Convergent evolution of face spaces across human face-selective neuronal groups and deep convolutional networks.

Authors:  Shany Grossman; Guy Gaziv; Erin M Yeagle; Michal Harel; Pierre Mégevand; David M Groppe; Simon Khuvis; Jose L Herrero; Michal Irani; Ashesh D Mehta; Rafael Malach
Journal:  Nat Commun       Date:  2019-10-30       Impact factor: 14.919

Cited by:  4 in total

1.  The phase of cortical oscillations determines the perceptual fate of visual cues in naturalistic audiovisual speech.

Authors:  Raphaël Thézé; Anne-Lise Giraud; Pierre Mégevand
Journal:  Sci Adv       Date:  2020-11-04       Impact factor: 14.136

Review 2.  Faces and Voices Processing in Human and Primate Brains: Rhythmic and Multimodal Mechanisms Underlying the Evolution and Development of Speech.

Authors:  Maëva Michon; José Zamorano-Abramson; Francisco Aboitiz
Journal:  Front Psychol       Date:  2022-03-30

3.  Differential Auditory and Visual Phase-Locking Are Observed during Audio-Visual Benefit and Silent Lip-Reading for Speech Perception.

Authors:  Máté Aller; Heidi Solberg Økland; Lucy J MacGregor; Helen Blank; Matthew H Davis
Journal:  J Neurosci       Date:  2022-06-27       Impact factor: 6.709

4.  MEG Activity in Visual and Auditory Cortices Represents Acoustic Speech-Related Information during Silent Lip Reading.

Authors:  Felix Bröhl; Anne Keitel; Christoph Kayser
Journal:  eNeuro       Date:  2022-06-27
