
Multisensory integration sites identified by perception of spatial wavelet filtered visual speech gesture information.

Daniel E Callan, Jeffery A Jones, Kevin Munhall, Christian Kroos, Akiko M Callan, Eric Vatikiotis-Bateson.

Abstract

Perception of speech is improved when presentation of the audio signal is accompanied by concordant visual speech gesture information. This enhancement is most prevalent when the audio signal is degraded. One potential means by which the brain affords perceptual enhancement is thought to be through the integration of concordant information from multiple sensory channels at common sites of convergence: multisensory integration (MSI) sites. Some studies have identified potential sites in the superior temporal gyrus/sulcus (STG/S) that are responsive to multisensory information from the auditory speech signal and visual speech movement. One limitation of these studies is that they do not control for activity resulting from attentional modulation cued by visual information signaling the onsets and offsets of the acoustic speech signal, nor for activity resulting from MSI of properties of the auditory speech signal with aspects of gross visual motion that are not specific to place of articulation information. This fMRI experiment uses spatial wavelet bandpass filtered Japanese sentences presented with background multispeaker audio noise to discern brain activity reflecting MSI induced by auditory and visual correspondence of place of articulation information, while controlling for activity resulting from the above-mentioned factors. The experiment consists of a low-frequency (LF) filtered condition containing gross visual motion of the lips, jaw, and head without specific place of articulation information; a mid-frequency (MF) filtered condition containing place of articulation information; and an unfiltered (UF) condition. Sites of MSI selectively induced by auditory and visual correspondence of place of articulation information were determined by the presence of activity for both the MF and UF conditions relative to the LF condition. Based on these criteria, sites of MSI were found predominantly in the left middle temporal gyrus (MTG) and the left STG/S (including the auditory cortex). By controlling for additional factors that could also induce greater activity resulting from visual motion information, this study identifies potential MSI sites that we believe are involved with improved speech perception intelligibility.


Year:  2004        PMID: 15200708     DOI: 10.1162/089892904970771

Source DB:  PubMed          Journal:  J Cogn Neurosci        ISSN: 0898-929X            Impact factor:   3.225


Related articles: 38 in total

1.  Neural correlates of interindividual differences in children's audiovisual speech perception.

Authors:  Audrey R Nath; Eswen E Fava; Michael S Beauchamp
Journal:  J Neurosci       Date:  2011-09-28       Impact factor: 6.167

2. (Review)  On the use of superadditivity as a metric for characterizing multisensory integration in functional neuroimaging studies.

Authors:  Paul J Laurienti; Thomas J Perrault; Terrence R Stanford; Mark T Wallace; Barry E Stein
Journal:  Exp Brain Res       Date:  2005-06-30       Impact factor: 1.972

3.  Perceptual fusion and stimulus coincidence in the cross-modal integration of speech.

Authors:  Lee M Miller; Mark D'Esposito
Journal:  J Neurosci       Date:  2005-06-22       Impact factor: 6.167

4.  Dynamic changes in superior temporal sulcus connectivity during perception of noisy audiovisual speech.

Authors:  Audrey R Nath; Michael S Beauchamp
Journal:  J Neurosci       Date:  2011-02-02       Impact factor: 6.167

5. (Review)  The processing of audio-visual speech: empirical and neural bases.

Authors:  Ruth Campbell
Journal:  Philos Trans R Soc Lond B Biol Sci       Date:  2008-03-12       Impact factor: 6.237

6.  Different neural frequency bands integrate faces and voices differently in the superior temporal sulcus.

Authors:  Chandramouli Chandrasekaran; Asif A Ghazanfar
Journal:  J Neurophysiol       Date:  2008-11-26       Impact factor: 2.714

7.  Giving speech a hand: gesture modulates activity in auditory cortex during speech perception.

Authors:  Amy L Hubbard; Stephen M Wilson; Daniel E Callan; Mirella Dapretto
Journal:  Hum Brain Mapp       Date:  2009-03       Impact factor: 5.038

8.  The differentiation of iconic and metaphoric gestures: common and unique integration processes.

Authors:  Benjamin Straube; Antonia Green; Bianca Bromberger; Tilo Kircher
Journal:  Hum Brain Mapp       Date:  2011-04       Impact factor: 5.038

9.  Auditory, Visual and Audiovisual Speech Processing Streams in Superior Temporal Sulcus.

Authors:  Jonathan H Venezia; Kenneth I Vaden; Feng Rong; Dale Maddox; Kourosh Saberi; Gregory Hickok
Journal:  Front Hum Neurosci       Date:  2017-04-07       Impact factor: 3.169

10.  A multisensory cortical network for understanding speech in noise.

Authors:  Christopher W Bishop; Lee M Miller
Journal:  J Cogn Neurosci       Date:  2009-09       Impact factor: 3.225

