Lip-Reading Enables the Brain to Synthesize Auditory Features of Unknown Silent Speech.

Mathieu Bourguignon1,2,3, Martijn Baart4,5, Efthymia C Kapnoula4, Nicola Molinaro4,6.   

Abstract

Lip-reading is crucial for understanding speech in challenging conditions. But how the brain extracts meaning from silent, visual speech is still under debate. Lip-reading in silence activates the auditory cortices, but it is not known whether such activation reflects immediate synthesis of the corresponding auditory stimulus or imagery of unrelated sounds. To disentangle these possibilities, we used magnetoencephalography to evaluate how cortical activity in 28 healthy adult humans (17 females) entrained to the auditory speech envelope and lip movements (mouth opening) when listening to a spoken story without visual input (audio-only), and when seeing a silent video of a speaker articulating another story (video-only). In video-only, auditory cortical activity entrained to the absent auditory signal at frequencies <1 Hz more than to the seen lip movements. This entrainment process was characterized by an auditory-speech-to-brain delay of ∼70 ms in the left hemisphere, compared with ∼20 ms in audio-only. Entrainment to mouth opening was found in the right angular gyrus at <1 Hz, and in early visual cortices at 1-8 Hz. These findings demonstrate that the brain can use a silent lip-read signal to synthesize a coarse-grained auditory speech representation in early auditory cortices. Our data indicate the following underlying oscillatory mechanism: seeing lip movements first modulates neuronal activity in early visual cortices at frequencies that match articulatory lip movements; the right angular gyrus then extracts slower features of lip movements, mapping them onto the corresponding speech sound features; this information is fed to auditory cortices, most likely facilitating speech parsing.

SIGNIFICANCE STATEMENT  Lip-reading consists of decoding speech based on visual information derived from observation of a speaker's articulatory facial gestures. Lip-reading is known to improve auditory speech understanding, especially when speech is degraded.
Interestingly, lip-reading in silence still activates the auditory cortices, even when participants do not know what the absent auditory signal should be. However, it was uncertain what such activation reflected. Here, using magnetoencephalographic recordings, we demonstrate that it reflects fast synthesis of the auditory stimulus rather than mental imagery of unrelated sounds, whether speech or non-speech. Our results also shed light on the oscillatory dynamics underlying lip-reading.
Copyright © 2020 the authors.

Keywords:  audiovisual integration; lip-reading; magnetoencephalography; silent speech; speech entrainment

Year:  2019        PMID: 31889007      PMCID: PMC6989012          DOI: 10.1523/JNEUROSCI.1101-19.2019

Source DB:  PubMed          Journal:  J Neurosci        ISSN: 0270-6474            Impact factor:   6.167


References:  60 in total (first 10 shown)

1.  Nonparametric permutation tests for functional neuroimaging: a primer with examples.

Authors:  Thomas E Nichols; Andrew P Holmes
Journal:  Hum Brain Mapp       Date:  2002-01       Impact factor: 5.038

2.  Cortical substrates for the perception of face actions: an fMRI study of the specificity of activation for seen speech and for meaningless lower-face acts (gurning).

Authors:  R Campbell; M MacSweeney; S Surguladze; G Calvert; P McGuire; J Suckling; M J Brammer; A S David
Journal:  Brain Res Cogn Brain Res       Date:  2001-10

3.  Gradient effects of within-category phonetic variation on lexical access.

Authors:  Bob McMurray; Michael K Tanenhaus; Richard N Aslin
Journal:  Cognition       Date:  2002-12

4.  Electrophysiological evidence for a multisensory speech-specific mode of perception.

Authors:  Jeroen J Stekelenburg; Jean Vroomen
Journal:  Neuropsychologia       Date:  2012-03-04       Impact factor: 3.139

5.  Phase patterns of neuronal responses reliably discriminate speech in human auditory cortex.

Authors:  Huan Luo; David Poeppel
Journal:  Neuron       Date:  2007-06-21       Impact factor: 17.173

6.  Neural responses to uninterrupted natural speech can be extracted with precise temporal resolution.

Authors:  Edmund C Lalor; John J Foxe
Journal:  Eur J Neurosci       Date:  2009-12-21       Impact factor: 3.386

7.  Eye movement of perceivers during audiovisual speech perception.

Authors:  E Vatikiotis-Bateson; I M Eigsti; S Yano; K G Munhall
Journal:  Percept Psychophys       Date:  1998-08

8.  [Review] Temporal cortex activation in humans viewing eye and mouth movements.

Authors:  A Puce; T Allison; S Bentin; J C Gore; G McCarthy
Journal:  J Neurosci       Date:  1998-03-15       Impact factor: 6.167

9.  Mechanisms underlying selective neuronal tracking of attended speech at a "cocktail party".

Authors:  Elana M Zion Golumbic; Nai Ding; Stephan Bickel; Peter Lakatos; Catherine A Schevon; Guy M McKhann; Robert R Goodman; Ronald Emerson; Ashesh D Mehta; Jonathan Z Simon; David Poeppel; Charles E Schroeder
Journal:  Neuron       Date:  2013-03-06       Impact factor: 17.173

10.  MEG Insight into the Spectral Dynamics Underlying Steady Isometric Muscle Contraction.

Authors:  Mathieu Bourguignon; Harri Piitulainen; Eero Smeds; Guangyu Zhou; Veikko Jousmäki; Riitta Hari
Journal:  J Neurosci       Date:  2017-09-26       Impact factor: 6.167

Cited by:  9 in total

1.  Shared and modality-specific brain regions that mediate auditory and visual word comprehension.

Authors:  Anne Keitel; Joachim Gross; Christoph Kayser
Journal:  Elife       Date:  2020-08-24       Impact factor: 8.140

2.  Crossmodal Phase Reset and Evoked Responses Provide Complementary Mechanisms for the Influence of Visual Speech in Auditory Cortex.

Authors:  Pierre Mégevand; Manuel R Mercier; David M Groppe; Elana Zion Golumbic; Nima Mesgarani; Michael S Beauchamp; Charles E Schroeder; Ashesh D Mehta
Journal:  J Neurosci       Date:  2020-10-06       Impact factor: 6.167

3.  [Review] Faces and Voices Processing in Human and Primate Brains: Rhythmic and Multimodal Mechanisms Underlying the Evolution and Development of Speech.

Authors:  Maëva Michon; José Zamorano-Abramson; Francisco Aboitiz
Journal:  Front Psychol       Date:  2022-03-30

4.  Differential Auditory and Visual Phase-Locking Are Observed during Audio-Visual Benefit and Silent Lip-Reading for Speech Perception.

Authors:  Máté Aller; Heidi Solberg Økland; Lucy J MacGregor; Helen Blank; Matthew H Davis
Journal:  J Neurosci       Date:  2022-06-27       Impact factor: 6.709

5.  MEG Activity in Visual and Auditory Cortices Represents Acoustic Speech-Related Information during Silent Lip Reading.

Authors:  Felix Bröhl; Anne Keitel; Christoph Kayser
Journal:  eNeuro       Date:  2022-06-27

6.  Auditory detection is modulated by theta phase of silent lip movements.

Authors:  Emmanuel Biau; Danying Wang; Hyojin Park; Ole Jensen; Simon Hanslmayr
Journal:  Curr Res Neurobiol       Date:  2021-06-12

7.  Electrophysiological Dynamics of Visual Speech Processing and the Role of Orofacial Effectors for Cross-Modal Predictions.

Authors:  Maëva Michon; Gonzalo Boncompte; Vladimir López
Journal:  Front Hum Neurosci       Date:  2020-10-27       Impact factor: 3.169

8.  [Review] Visual Influences on Auditory Behavioral, Neural, and Perceptual Processes: A Review.

Authors:  Collins Opoku-Baah; Adriana M Schoenhaut; Sarah G Vassall; David A Tovar; Ramnarayan Ramachandran; Mark T Wallace
Journal:  J Assoc Res Otolaryngol       Date:  2021-05-20

9.  [Review] Multisensory Integration: Is Medial Prefrontal Cortex Signaling Relevant for the Treatment of Higher-Order Visual Dysfunctions?

Authors:  Miguel Skirzewski; Stéphane Molotchnikoff; Luis F Hernandez; José Fernando Maya-Vetencourt
Journal:  Front Mol Neurosci       Date:  2022-01-17       Impact factor: 5.639

