Quantified acoustic-optical speech signal incongruity identifies cortical sites of audiovisual speech processing.

Lynne E Bernstein; Zhong-Lin Lu; Jintao Jiang

Abstract

A fundamental question about human perception is how the speech-perceiving brain combines auditory and visual phonetic stimulus information. We assumed that perceivers learn the normal relationship between acoustic and optical signals. We hypothesized that when the normal relationship is perturbed by mismatching the acoustic and optical signals, cortical areas responsible for audiovisual stimulus integration respond as a function of the magnitude of the mismatch. To test this hypothesis, in a previous study, we developed quantitative measures of acoustic-optical speech stimulus incongruity that correlate with perceptual measures. In the current study, we presented low incongruity (LI, matched), medium incongruity (MI, moderately mismatched), and high incongruity (HI, highly mismatched) audiovisual nonsense syllable stimuli during fMRI scanning. Perceptual responses differed as a function of the incongruity level, and BOLD measures were found to vary regionally and quantitatively with perceptual and quantitative incongruity levels. Each increase in the level of incongruity resulted in an increase in overall levels of cortical activity and in additional activations. However, the only cortical region that demonstrated differential sensitivity to the three stimulus incongruity levels (HI>MI>LI) was a subarea of the left supramarginal gyrus (SMG). As hypothesized here, the left SMG might support a fine-grained analysis of the relationship between audiovisual phonetic input and stored knowledge. The methods here show that quantitative manipulation of stimulus incongruity is a new and powerful tool for probing the system that processes audiovisual speech stimuli.


Year: 2008    PMID: 18495091    PMCID: PMC2584162    DOI: 10.1016/j.brainres.2008.04.018

Source DB: PubMed    Journal: Brain Res    ISSN: 0006-8993    Impact factor: 3.252


References: 46 in total (first 10 shown)

1.  Response amplification in sensory-specific cortices during crossmodal binding.

Authors:  G A Calvert; M J Brammer; E T Bullmore; R Campbell; S D Iversen; A S David
Journal:  Neuroreport       Date:  1999-08-20       Impact factor: 1.837

2.  Speech perception without hearing.

Authors:  L E Bernstein; M E Demorest; P E Tucker
Journal:  Percept Psychophys       Date:  2000-02

3.  Evidence from functional magnetic resonance imaging of crossmodal binding in the human heteromodal cortex.

Authors:  G A Calvert; R Campbell; M J Brammer
Journal:  Curr Biol       Date:  2000-06-01       Impact factor: 10.834

Review 4.  Crossmodal processing in the human brain: insights from functional neuroimaging studies.

Authors:  G A Calvert
Journal:  Cereb Cortex       Date:  2001-12       Impact factor: 5.357

5.  Processing of changes in visual speech in the human auditory cortex.

Authors:  Riikka Möttönen; Christina M Krause; Kaisa Tiippana; Mikko Sams
Journal:  Brain Res Cogn Brain Res       Date:  2002-05

6.  Speech listening specifically modulates the excitability of tongue muscles: a TMS study.

Authors:  Luciano Fadiga; Laila Craighero; Giovanni Buccino; Giacomo Rizzolatti
Journal:  Eur J Neurosci       Date:  2002-01       Impact factor: 3.386

7.  Thresholding of statistical maps in functional neuroimaging using the false discovery rate.

Authors:  Christopher R Genovese; Nicole A Lazar; Thomas Nichols
Journal:  Neuroimage       Date:  2002-04       Impact factor: 6.556

8.  Identification of a pathway for intelligible speech in the left temporal lobe.

Authors:  S K Scott; C C Blank; S Rosen; R J Wise
Journal:  Brain       Date:  2000-12       Impact factor: 13.501

9.  A novel approach to study audiovisual integration in speech perception: localizer fMRI and sparse sampling.

Authors:  Gregor Rafael Szycik; Peggy Tausche; Thomas F Münte
Journal:  Brain Res       Date:  2007-08-19       Impact factor: 3.252

10.  Lip-reading ability and patterns of cortical activation studied using fMRI.

Authors:  C N Ludman; A Q Summerfield; D Hall; M Elliott; J Foster; J L Hykin; R Bowtell; P G Morris
Journal:  Br J Audiol       Date:  2000-08
Cited by: 21 in total (first 10 shown)

Review 1.  Some behavioral and neurobiological constraints on theories of audiovisual speech integration: a review and suggestions for new directions.

Authors:  Nicholas Altieri; David B Pisoni; James T Townsend
Journal:  Seeing Perceiving       Date:  2011-09-29

2.  Dynamic changes in superior temporal sulcus connectivity during perception of noisy audiovisual speech.

Authors:  Audrey R Nath; Michael S Beauchamp
Journal:  J Neurosci       Date:  2011-02-02       Impact factor: 6.167

3.  Mismatch negativity with visual-only and audiovisual speech.

Authors:  Curtis W Ponton; Lynne E Bernstein; Edward T Auer
Journal:  Brain Topogr       Date:  2009-04-30       Impact factor: 3.020

4.  A visual or tactile signal makes auditory speech detection more efficient by reducing uncertainty.

Authors:  Bosco S Tjan; Ewen Chao; Lynne E Bernstein
Journal:  Eur J Neurosci       Date:  2014-01-09       Impact factor: 3.386

5.  Audiovisual integration as conflict resolution: The conflict of the McGurk illusion.

Authors:  Luis Morís Fernández; Emiliano Macaluso; Salvador Soto-Faraco
Journal:  Hum Brain Mapp       Date:  2017-08-09       Impact factor: 5.038

6.  Speech comprehension aided by multiple modalities: behavioural and neural interactions.

Authors:  Carolyn McGettigan; Andrew Faulkner; Irene Altarelli; Jonas Obleser; Harriet Baverstock; Sophie K Scott
Journal:  Neuropsychologia       Date:  2012-01-17       Impact factor: 3.139

7.  Psychophysics of the McGurk and other audiovisual speech integration effects.

Authors:  Jintao Jiang; Lynne E Bernstein
Journal:  J Exp Psychol Hum Percept Perform       Date:  2011-08       Impact factor: 3.332

8.  Neural development of networks for audiovisual speech comprehension.

Authors:  Anthony Steven Dick; Ana Solodkin; Steven L Small
Journal:  Brain Lang       Date:  2009-09-24       Impact factor: 2.381

Review 9.  Multisensory connections of monkey auditory cerebral cortex.

Authors:  John F Smiley; Arnaud Falchier
Journal:  Hear Res       Date:  2009-07-18       Impact factor: 3.208

10.  Lip-reading aids word recognition most in moderate noise: a Bayesian explanation using high-dimensional feature space.

Authors:  Wei Ji Ma; Xiang Zhou; Lars A Ross; John J Foxe; Lucas C Parra
Journal:  PLoS One       Date:  2009-03-04       Impact factor: 3.240

