Prediction and constraint in audiovisual speech perception.

Jonathan E. Peelle, Mitchell S. Sommers

Abstract

During face-to-face conversational speech listeners must efficiently process a rapid and complex stream of multisensory information. Visual speech can serve as a critical complement to auditory information because it provides cues to both the timing of the incoming acoustic signal (the amplitude envelope, influencing attention and perceptual sensitivity) and its content (place and manner of articulation, constraining lexical selection). Here we review behavioral and neurophysiological evidence regarding listeners' use of visual speech information. Multisensory integration of audiovisual speech cues improves recognition accuracy, particularly for speech in noise. Even when speech is intelligible based solely on auditory information, adding visual information may reduce the cognitive demands placed on listeners through increasing the precision of prediction. Electrophysiological studies demonstrate that oscillatory cortical entrainment to speech in auditory cortex is enhanced when visual speech is present, increasing sensitivity to important acoustic cues. Neuroimaging studies also suggest increased activity in auditory cortex when congruent visual information is available, but additionally emphasize the involvement of heteromodal regions of posterior superior temporal sulcus as playing a role in integrative processing. We interpret these findings in a framework of temporally-focused lexical competition in which visual speech information affects auditory processing to increase sensitivity to acoustic information through an early integration mechanism, and a late integration stage that incorporates specific information about a speaker's articulators to constrain the number of possible candidates in a spoken utterance. Ultimately it is words compatible with both auditory and visual information that most strongly determine successful speech perception during everyday listening. 
Thus, audiovisual speech perception is accomplished through multiple stages of integration, supported by distinct neuroanatomical mechanisms.
Copyright © 2015 Elsevier Ltd. All rights reserved.

Keywords:  Audiovisual speech; Multisensory integration; Predictive coding; Predictive timing; Speech perception

Year:  2015        PMID: 25890390      PMCID: PMC4475441          DOI: 10.1016/j.cortex.2015.03.006

Source DB:  PubMed          Journal:  Cortex        ISSN: 0010-9452            Impact factor:   4.027


References: 93 in total

1.  Evidence from functional magnetic resonance imaging of crossmodal binding in the human heteromodal cortex.

Authors:  G A Calvert; R Campbell; M J Brammer
Journal:  Curr Biol       Date:  2000-06-01       Impact factor: 10.834

2.  Polysensory interactions along lateral temporal regions evoked by audiovisual speech.

Authors:  Tarra M Wright; Kevin A Pelphrey; Truett Allison; Martin J McKeown; Gregory McCarthy
Journal:  Cereb Cortex       Date:  2003-10       Impact factor: 5.357

3.  Detectability of auditory signals presented without defined observation intervals.

Authors:  C S Watson; T L Nichols
Journal:  J Acoust Soc Am       Date:  1976-03       Impact factor: 1.840

4.  Phase patterns of neuronal responses reliably discriminate speech in human auditory cortex.

Authors:  Huan Luo; David Poeppel
Journal:  Neuron       Date:  2007-06-21       Impact factor: 17.173

5.  The role of visual speech information in supporting perceptual learning of degraded speech.

Authors:  Rachel V Wayne; Ingrid S Johnsrude
Journal:  J Exp Psychol Appl       Date:  2012-12

6.  Rhythmicity and cross-modal temporal cues facilitate detection.

Authors:  Sanne ten Oever; Charles E Schroeder; David Poeppel; Nienke van Atteveldt; Elana Zion-Golumbic
Journal:  Neuropsychologia       Date:  2014-08-13       Impact factor: 3.139

7.  The natural statistics of audiovisual speech.

Authors:  Chandramouli Chandrasekaran; Andrea Trubanova; Sébastien Stillittano; Alice Caplier; Asif A Ghazanfar
Journal:  PLoS Comput Biol       Date:  2009-07-17       Impact factor: 4.475

8.  Auditory and visual lexical neighborhoods in audiovisual speech perception.

Authors:  Nancy Tye-Murray; Mitchell Sommers; Brent Spehar
Journal:  Trends Amplif       Date:  2007-12

9.  Reading your own lips: common-coding theory and visual speech perception.

Authors:  Nancy Tye-Murray; Brent P Spehar; Joel Myerson; Sandra Hale; Mitchell S Sommers
Journal:  Psychon Bull Rev       Date:  2013-02

10.  Cross-frequency coupling between neuronal oscillations.

Authors:  Ole Jensen; Laura L Colgin
Journal:  Trends Cogn Sci       Date:  2007-06-04       Impact factor: 20.229

Cited by: 50 in total

1.  Oscillatory phase shapes syllable perception.

Authors:  Sanne ten Oever; Alexander T Sack
Journal:  Proc Natl Acad Sci U S A       Date:  2015-12-14       Impact factor: 11.205

2.  Electrocorticography reveals continuous auditory and visual speech tracking in temporal and occipital cortex.

Authors:  Cristiano Micheli; Inga M Schepers; Müge Ozker; Daniel Yoshor; Michael S Beauchamp; Jochem W Rieger
Journal:  Eur J Neurosci       Date:  2018-08-12       Impact factor: 3.386

3.  Acoustic noise and vision differentially warp the auditory categorization of speech.

Authors:  Gavin M Bidelman; Lauren Sigley; Gwyneth A Lewis
Journal:  J Acoust Soc Am       Date:  2019-07       Impact factor: 1.840

4.  Audiovisual sentence recognition not predicted by susceptibility to the McGurk effect.

Authors:  Kristin J Van Engen; Zilong Xie; Bharath Chandrasekaran
Journal:  Atten Percept Psychophys       Date:  2017-02       Impact factor: 2.199

5.  Eye Can Hear Clearly Now: Inverse Effectiveness in Natural Audiovisual Speech Processing Relies on Long-Term Crossmodal Temporal Integration.

Authors:  Michael J Crosse; Giovanni M Di Liberto; Edmund C Lalor
Journal:  J Neurosci       Date:  2016-09-21       Impact factor: 6.167

6.  Anomalous network architecture of the resting brain in children who stutter.

Authors:  Soo-Eun Chang; Michael Angstadt; Ho Ming Chow; Andrew C Etchell; Emily O Garnett; Ai Leen Choo; Daniel Kessler; Robert C Welsh; Chandra Sripada
Journal:  J Fluency Disord       Date:  2017-01-25       Impact factor: 2.538

7.  Experiments on Auditory-Visual Perception of Sentences by Users of Unilateral, Bimodal, and Bilateral Cochlear Implants.

Authors:  Michael F Dorman; Julie Liss; Shuai Wang; Visar Berisha; Cimarron Ludwig; Sarah Cook Natale
Journal:  J Speech Lang Hear Res       Date:  2016-12-01       Impact factor: 2.297

8.  Face viewing behavior predicts multisensory gain during speech perception.

Authors:  Johannes Rennig; Kira Wegner-Clemens; Michael S Beauchamp
Journal:  Psychon Bull Rev       Date:  2020-02

9.  Neural evidence accounting for interindividual variability of the McGurk illusion.

Authors:  Antoine J Shahin
Journal:  Neurosci Lett       Date:  2019-06-07       Impact factor: 3.046

10.  Shared and modality-specific brain regions that mediate auditory and visual word comprehension.

Authors:  Anne Keitel; Joachim Gross; Christoph Kayser
Journal:  Elife       Date:  2020-08-24       Impact factor: 8.140
