Combining visual and auditory information.

David Burr, David Alais.

Abstract

Robust perception requires that information from our five different senses be combined at some central level to produce a single unified percept of the world. Recent theory and evidence from many laboratories suggest that this combination does not occur in a rigid, hardwired fashion, but follows flexible, situation-dependent rules that allow information to be combined with maximal efficiency. In this review we discuss recent evidence from our laboratories investigating how information from the auditory and visual modalities is combined. The results support the notion of Bayesian combination. We also examine the temporal alignment of auditory and visual signals, and show that perceived simultaneity does not depend solely on neural latencies, but involves active processes that compensate, for example, for the physical delay introduced by the relatively slow speed of sound. Finally, we show that although visual and auditory information is combined to maximize efficiency, attentional resources for the two modalities are largely independent.

Year:  2006        PMID: 17027392     DOI: 10.1016/S0079-6123(06)55014-9

Source DB:  PubMed          Journal:  Prog Brain Res        ISSN: 0079-6123            Impact factor:   2.453


Related articles (26 in total)

1.  The processing of visual and auditory information for reaching movements.

Authors:  Cheryl M Glazebrook; Timothy N Welsh; Luc Tremblay
Journal:  Psychol Res       Date:  2015-08-08

2.  Coordination of Orofacial Motor Actions into Exploratory Behavior by Rat.

Authors:  Anastasia Kurnikova; Jeffrey D Moore; Song-Mao Liao; Martin Deschênes; David Kleinfeld
Journal:  Curr Biol       Date:  2017-02-16       Impact factor: 10.834

3.  Audiovisual temporal capture underlies flash fusion.

Authors:  Takahiro Kawabe
Journal:  Exp Brain Res       Date:  2009-06-12       Impact factor: 1.972

4.  Multiple modes of phase locking between sniffing and whisking during active exploration.

Authors:  Sachin Ranade; Balázs Hangya; Adam Kepecs
Journal:  J Neurosci       Date:  2013-05-08       Impact factor: 6.167

5.  Compensations in response to real-time formant perturbations of different magnitudes.

Authors:  Ewen N MacDonald; Robyn Goldberg; Kevin G Munhall
Journal:  J Acoust Soc Am       Date:  2010-02       Impact factor: 1.840

6.  Synchronization to auditory and visual rhythms in hearing and deaf individuals.

Authors:  John R Iversen; Aniruddh D Patel; Brenda Nicodemus; Karen Emmorey
Journal:  Cognition       Date:  2014-11-19

7.  Efficient visual search from synchronized auditory signals requires transient audiovisual events.

Authors:  Erik Van der Burg; John Cass; Christian N L Olivers; Jan Theeuwes; David Alais
Journal:  PLoS One       Date:  2010-05-14       Impact factor: 3.240

8.  Multisensory perceptual learning of temporal order: audiovisual learning transfers to vision but not audition.

Authors:  David Alais; John Cass
Journal:  PLoS One       Date:  2010-06-23       Impact factor: 3.240

Review 9.  Multimodal activity in the parietal cortex.

Authors:  Yale E Cohen
Journal:  Hear Res       Date:  2009-02-06       Impact factor: 3.208

Review 10.  Interactions of auditory and visual stimuli in space and time.

Authors:  Gregg H Recanzone
Journal:  Hear Res       Date:  2009-04-22       Impact factor: 3.208
