
A comparison of bound and unbound audio-visual information processing in the human cerebral cortex.

Ingrid R Olson, J Christopher Gatenby, John C Gore.

Abstract

Human speech has auditory (heard speech) and visual (seen speech) qualities. The neural representation of audiovisual integration in speech was investigated using functional magnetic resonance imaging (fMRI). Ten subjects were imaged while viewing a face in four different conditions: with speech and mouth movements synchronized, with speech and mouth movements desynchronized, during silent speech, or while viewing a static face. Subtractions of the different sets of images showed that lipreading primarily activated the superior temporal gyrus and sulcus (STG/STS). Synchronized and desynchronized audio-visual speech activated similar areas. Regions activated more in the synchronized than in the desynchronized condition were considered to be those involved in cross-modal integration. One dominant activation focus was found near the left claustrum, a subcortical region. A region-of-interest analysis of the STS and parietal areas found no difference between the audio-visual conditions; however, it did show that synchronized audio-visual stimuli led to a higher signal change in the claustrum region. This study extends previous results obtained with other sensory combinations and other tasks, indicating involvement of the claustrum in sensory integration.


Year:  2002        PMID: 12063136     DOI: 10.1016/s0926-6410(02)00067-8

Source DB:  PubMed          Journal:  Brain Res Cogn Brain Res        ISSN: 0926-6410


Related articles (41 in total)

1.  Bimodal speech: early suppressive visual effects in human auditory cortex.

Authors:  Julien Besle; Alexandra Fort; Claude Delpuech; Marie-Hélène Giard
Journal:  Eur J Neurosci       Date:  2004-10       Impact factor: 3.386

2.  Neural correlates of interindividual differences in children's audiovisual speech perception.

Authors:  Audrey R Nath; Eswen E Fava; Michael S Beauchamp
Journal:  J Neurosci       Date:  2011-09-28       Impact factor: 6.167

3.  Anatomical changes in the emerging adult brain: a voxel-based morphometry study.

Authors:  Craig M Bennett; Abigail A Baird
Journal:  Hum Brain Mapp       Date:  2006-09       Impact factor: 5.038

4.  Perceptual fusion and stimulus coincidence in the cross-modal integration of speech.

Authors:  Lee M Miller; Mark D'Esposito
Journal:  J Neurosci       Date:  2005-06-22       Impact factor: 6.167

5.  Neural responses elicited to face motion and vocalization pairings.

Authors:  Aina Puce; James A Epling; James C Thompson; Olivia K Carrick
Journal:  Neuropsychologia       Date:  2007-01-07       Impact factor: 3.139

6.  Hearing lips and seeing voices: how cortical areas supporting speech production mediate audiovisual speech perception.

Authors:  Jeremy I Skipper; Virginie van Wassenhove; Howard C Nusbaum; Steven L Small
Journal:  Cereb Cortex       Date:  2007-01-11       Impact factor: 5.357

7.  Visual influences on perception of speech and nonspeech vocal-tract events.

Authors:  Lawrence Brancazio; Catherine T Best; Carol A Fowler
Journal:  Lang Speech       Date:  2006       Impact factor: 1.500

8.  Neural development of networks for audiovisual speech comprehension.

Authors:  Anthony Steven Dick; Ana Solodkin; Steven L Small
Journal:  Brain Lang       Date:  2009-09-24       Impact factor: 2.381

9.  The role of the posterior superior temporal sulcus in audiovisual processing.

Authors:  Julia Hocking; Cathy J Price
Journal:  Cereb Cortex       Date:  2008-02-14       Impact factor: 5.357

10.  A multisensory cortical network for understanding speech in noise.

Authors:  Christopher W Bishop; Lee M Miller
Journal:  J Cogn Neurosci       Date:  2009-09       Impact factor: 3.225


© 2022-2023 Beijing Coyote Bioscience Co., Ltd. (北京卡尤迪生物科技股份有限公司).