
Bimodal speech: early suppressive visual effects in human auditory cortex.

Julien Besle, Alexandra Fort, Claude Delpuech, Marie-Hélène Giard.

Abstract

While everyone has experienced that seeing lip movements may improve speech perception, little is known about the neural mechanisms by which audiovisual speech information is combined. Event-related potentials (ERPs) were recorded while subjects performed an auditory recognition task among four different natural syllables randomly presented in the auditory (A), visual (V) or congruent bimodal (AV) condition. We found that: (i) bimodal syllables were identified more rapidly than auditory alone stimuli; (ii) this behavioural facilitation was associated with cross-modal [AV-(A+V)] ERP effects around 120-190 ms latency, expressed mainly as a decrease of unimodal N1 generator activities in the auditory cortex. This finding provides evidence for suppressive, speech-specific audiovisual integration mechanisms, which are likely to be related to the dominance of the auditory modality for speech perception. Furthermore, the latency of the effect indicates that integration operates at pre-representational stages of stimulus analysis, probably via feedback projections from visual and/or polymodal areas.

Year:  2004        PMID: 15450102      PMCID: PMC1885424          DOI: 10.1111/j.1460-9568.2004.03670.x

Source DB:  PubMed          Journal:  Eur J Neurosci        ISSN: 0953-816X            Impact factor:   3.386


References (56 in total)

1.  The perception of speech sounds by the human brain as reflected by the mismatch negativity (MMN) and its magnetic equivalent (MMNm).

Authors:  R Näätänen
Journal:  Psychophysiology       Date:  2001-01       Impact factor: 4.016

2.  Bisensory augmentation: a speechreading advantage when speech is clearly audible and intact.

Authors:  P Arnold; F Hill
Journal:  Br J Psychol       Date:  2001-05

3.  Mismatch negativity evoked by the McGurk-MacDonald effect: a phonetic representation within short-term memory.

Authors:  C Colin; M Radeau; A Soquet; D Demolin; F Colin; P Deltenre
Journal:  Clin Neurophysiol       Date:  2002-04       Impact factor: 3.708

4.  Processing of changes in visual speech in the human auditory cortex.

Authors:  Riikka Möttönen; Christina M Krause; Kaisa Tiippana; Mikko Sams
Journal:  Brain Res Cogn Brain Res       Date:  2002-05

5.  Deactivation of sensory-specific cortex by cross-modal stimuli.

Authors:  Paul J Laurienti; Jonathan H Burdette; Mark T Wallace; Yi-Fen Yen; Aaron S Field; Barry E Stein
Journal:  J Cogn Neurosci       Date:  2002-04-01       Impact factor: 3.225

6.  The timing and laminar profile of converging inputs to multisensory areas of the macaque neocortex.

Authors:  Charles E Schroeder; John J Foxe
Journal:  Brain Res Cogn Brain Res       Date:  2002-06

7.  Anatomical evidence of multimodal integration in primate striate cortex.

Authors:  Arnaud Falchier; Simon Clavagnier; Pascal Barone; Henry Kennedy
Journal:  J Neurosci       Date:  2002-07-01       Impact factor: 6.167

8.  Early auditory-visual interactions in human cortex during nonredundant target identification.

Authors:  Alexandra Fort; Claude Delpuech; Jacques Pernier; Marie-Hélène Giard
Journal:  Brain Res Cogn Brain Res       Date:  2002-06

9.  A comparison of bound and unbound audio-visual information processing in the human cerebral cortex.

Authors:  Ingrid R Olson; J Christopher Gatenby; John C Gore
Journal:  Brain Res Cogn Brain Res       Date:  2002-06

10.  Multisensory auditory-visual interactions during early sensory processing in humans: a high-density electrical mapping study.

Authors:  Sophie Molholm; Walter Ritter; Micah M Murray; Daniel C Javitt; Charles E Schroeder; John J Foxe
Journal:  Brain Res Cogn Brain Res       Date:  2002-06
Cited by (96 in total)

1.  Audiovisual speech integration in autism spectrum disorders: ERP evidence for atypicalities in lexical-semantic processing.

Authors:  Odette Megnin; Atlanta Flitton; Catherine R G Jones; Michelle de Haan; Torsten Baldeweg; Tony Charman
Journal:  Autism Res       Date:  2011-12-09       Impact factor: 5.216

2.  [Review] Some behavioral and neurobiological constraints on theories of audiovisual speech integration: a review and suggestions for new directions.

Authors:  Nicholas Altieri; David B Pisoni; James T Townsend
Journal:  Seeing Perceiving       Date:  2011-09-29

3.  Multistage audiovisual integration of speech: dissociating identification and detection.

Authors:  Kasper Eskelund; Jyrki Tuomainen; Tobias S Andersen
Journal:  Exp Brain Res       Date:  2010-12-25       Impact factor: 1.972

4.  Attention to visual speech gestures enhances hemodynamic activity in the left planum temporale.

Authors:  Johanna Pekkola; Ville Ojanen; Taina Autti; Iiro P Jääskeläinen; Riikka Möttönen; Mikko Sams
Journal:  Hum Brain Mapp       Date:  2006-06       Impact factor: 5.038

5.  Is the auditory sensory memory sensitive to visual information?

Authors:  Julien Besle; Alexandra Fort; Marie-Hélène Giard
Journal:  Exp Brain Res       Date:  2005-07-23       Impact factor: 1.972

6.  [Review] Neuronal oscillations and visual amplification of speech.

Authors:  Charles E Schroeder; Peter Lakatos; Yoshinao Kajikawa; Sarah Partan; Aina Puce
Journal:  Trends Cogn Sci       Date:  2008-02-15       Impact factor: 20.229

7.  Spatially congruent visual motion modulates activity of the primary auditory cortex.

Authors:  Mikhail Zvyagintsev; Andrey R Nikolaev; Heike Thönnessen; Olga Sachs; Jürgen Dammers; Klaus Mathiak
Journal:  Exp Brain Res       Date:  2009-05-17       Impact factor: 1.972

8.  Different neural frequency bands integrate faces and voices differently in the superior temporal sulcus.

Authors:  Chandramouli Chandrasekaran; Asif A Ghazanfar
Journal:  J Neurophysiol       Date:  2008-11-26       Impact factor: 2.714

9.  Dynamic faces speed up the onset of auditory cortical spiking responses during vocal detection.

Authors:  Chandramouli Chandrasekaran; Luis Lemus; Asif A Ghazanfar
Journal:  Proc Natl Acad Sci U S A       Date:  2013-11-11       Impact factor: 11.205

10.  Cross-modal prediction in speech depends on prior linguistic experience.

Authors:  Carolina Sánchez-García; James T Enns; Salvador Soto-Faraco
Journal:  Exp Brain Res       Date:  2013-02-06       Impact factor: 1.972

