
A multisensory cortical network for understanding speech in noise.

Christopher W Bishop, Lee M Miller.

Abstract

In noisy environments, listeners tend to hear a speaker's voice yet struggle to understand what is said. The most effective way to improve intelligibility in such conditions is to watch the speaker's mouth movements. Here we identify the neural networks that distinguish understanding from merely hearing speech, and determine how the brain applies visual information to improve intelligibility. Using functional magnetic resonance imaging, we show that understanding speech-in-noise is supported by a network of brain areas including the left superior parietal lobule, the motor/premotor cortex, and the left anterior superior temporal sulcus (STS), a likely apex of the acoustic processing hierarchy. Multisensory integration likely improves comprehension via enhanced communication among the left temporal-occipital boundary, the left medial-temporal lobe, and the left STS. This demonstrates how the brain uses information from multiple modalities to improve speech comprehension in naturalistic, acoustically adverse conditions.


Year:  2009        PMID: 18823249      PMCID: PMC2833290          DOI: 10.1162/jocn.2009.21118

Source DB:  PubMed          Journal:  J Cogn Neurosci        ISSN: 0898-929X            Impact factor:   3.225


References (88 in total)

1.  Evidence from functional magnetic resonance imaging of crossmodal binding in the human heteromodal cortex.

Authors:  G A Calvert; R Campbell; M J Brammer
Journal:  Curr Biol       Date:  2000-06-01       Impact factor: 10.834

2.  Polysensory interactions along lateral temporal regions evoked by audiovisual speech.

Authors:  Tarra M Wright; Kevin A Pelphrey; Truett Allison; Martin J McKeown; Gregory McCarthy
Journal:  Cereb Cortex       Date:  2003-10       Impact factor: 5.357

3.  Multisensory integration sites identified by perception of spatial wavelet filtered visual speech gesture information.

Authors:  Daniel E Callan; Jeffery A Jones; Kevin Munhall; Christian Kroos; Akiko M Callan; Eric Vatikiotis-Bateson
Journal:  J Cogn Neurosci       Date:  2004-06       Impact factor: 3.225

4.  Measuring functional connectivity during distinct stages of a cognitive task.

Authors:  Jesse Rissman; Adam Gazzaley; Mark D'Esposito
Journal:  Neuroimage       Date:  2004-10       Impact factor: 6.556

5.  Perceptual fusion and stimulus coincidence in the cross-modal integration of speech.

Authors:  Lee M Miller; Mark D'Esposito
Journal:  J Neurosci       Date:  2005-06-22       Impact factor: 6.167

6.  Hearing lips and seeing voices: how cortical areas supporting speech production mediate audiovisual speech perception.

Authors:  Jeremy I Skipper; Virginie van Wassenhove; Howard C Nusbaum; Steven L Small
Journal:  Cereb Cortex       Date:  2007-01-11       Impact factor: 5.357

7.  Language processing within the human medial temporal lobe.

Authors:  Patric Meyer; Axel Mecklinger; Thomas Grunwald; Juergen Fell; Christian E Elger; Angela D Friederici
Journal:  Hippocampus       Date:  2005       Impact factor: 3.899

8.  (Review) Multimodal integration for the representation of space in the posterior parietal cortex.

Authors:  R A Andersen
Journal:  Philos Trans R Soc Lond B Biol Sci       Date:  1997-10-29       Impact factor: 6.237

9.  (Review) Effects of phonetic context on audio-visual intelligibility of French.

Authors:  C Benoît; T Mohamadi; S Kandel
Journal:  J Speech Hear Res       Date:  1994-10

10.  FMRI responses to video and point-light displays of moving humans and manipulable objects.

Authors:  Michael S Beauchamp; Kathryn E Lee; James V Haxby; Alex Martin
Journal:  J Cogn Neurosci       Date:  2003-10-01       Impact factor: 3.225

Cited by (47 in total)

1.  Neural time course of visually enhanced echo suppression.

Authors:  Christopher W Bishop; Sam London; Lee M Miller
Journal:  J Neurophysiol       Date:  2012-07-11       Impact factor: 2.714

2.  EEG gamma-band activity during audiovisual speech comprehension in different noise environments.

Authors:  Yanfei Lin; Baolin Liu; Zhiwen Liu; Xiaorong Gao
Journal:  Cogn Neurodyn       Date:  2015-02-22       Impact factor: 5.082

3.  Rapid tuning of auditory "what" and "where" pathways by training.

Authors:  Yi Du; Yu He; Stephen R Arnott; Bernhard Ross; Xihong Wu; Liang Li; Claude Alain
Journal:  Cereb Cortex       Date:  2013-09-15       Impact factor: 5.357

4.  Testing sensory and multisensory function in children with autism spectrum disorder.

Authors:  Sarah H Baum; Ryan A Stevenson; Mark T Wallace
Journal:  J Vis Exp       Date:  2015-04-22       Impact factor: 1.355

5.  Cortical Tracking of Speech-in-Noise Develops from Childhood to Adulthood.

Authors:  Marc Vander Ghinst; Mathieu Bourguignon; Maxime Niesen; Vincent Wens; Sergio Hassid; Georges Choufani; Veikko Jousmäki; Riitta Hari; Serge Goldman; Xavier De Tiège
Journal:  J Neurosci       Date:  2019-02-11       Impact factor: 6.167

6.  Multisensory speech perception in autism spectrum disorder: From phoneme to whole-word perception.

Authors:  Ryan A Stevenson; Sarah H Baum; Magali Segers; Susanne Ferber; Morgan D Barense; Mark T Wallace
Journal:  Autism Res       Date:  2017-03-24       Impact factor: 5.216

7.  The role of the arcuate and middle longitudinal fasciculi in speech perception in noise in adulthood.

Authors:  Pascale Tremblay; Maxime Perron; Isabelle Deschamps; Dan Kennedy-Higgins; Jean-Christophe Houde; Anthony Steven Dick; Maxime Descoteaux
Journal:  Hum Brain Mapp       Date:  2018-09-12       Impact factor: 5.038

8.  Shared and modality-specific brain regions that mediate auditory and visual word comprehension.

Authors:  Anne Keitel; Joachim Gross; Christoph Kayser
Journal:  Elife       Date:  2020-08-24       Impact factor: 8.140

9.  Left posterior temporal regions are sensitive to auditory categorization.

Authors:  Rutvik Desai; Einat Liebenthal; Eric Waldron; Jeffrey R Binder
Journal:  J Cogn Neurosci       Date:  2008-07       Impact factor: 3.225

10.  An ALE meta-analysis on the audiovisual integration of speech signals.

Authors:  Laura C Erickson; Elizabeth Heeg; Josef P Rauschecker; Peter E Turkeltaub
Journal:  Hum Brain Mapp       Date:  2014-07-04       Impact factor: 5.038

