
Bayesian integration of visual and auditory signals for spatial localization.

Peter W Battaglia, Robert A Jacobs, Richard N Aslin

Abstract

Human observers localize events in the world by using sensory signals from multiple modalities. We evaluated two theories of spatial localization that predict how visual and auditory information are weighted when these signals specify different locations in space. According to one theory (visual capture), the signal that is typically most reliable dominates in a winner-take-all competition, whereas the other theory (maximum-likelihood estimation) proposes that perceptual judgments are based on a weighted average of the sensory signals in proportion to each signal's relative reliability. Our results indicate that both theories are partially correct, in that relative signal reliability significantly altered judgments of spatial location, but these judgments were also characterized by an overall bias to rely on visual over auditory information. These results have important implications for the development of cue integration and for neural plasticity in the adult brain that enables humans to optimally integrate multimodal information.
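The maximum-likelihood weighting scheme described in the abstract, in which each cue is weighted in proportion to its relative reliability (inverse variance), can be sketched as follows. This is a minimal illustration of the general MLE cue-combination rule, not the authors' code; the function name, cue values, and noise levels are assumed for the example.

```python
# Maximum-likelihood (reliability-weighted) cue integration sketch.
# Reliability is defined as inverse variance; weights are normalized
# reliabilities, so the more reliable cue dominates the combined estimate.

def mle_combine(s_vis, sigma_vis, s_aud, sigma_aud):
    """Combine visual and auditory location estimates by MLE weighting."""
    r_vis = 1.0 / sigma_vis ** 2   # visual reliability
    r_aud = 1.0 / sigma_aud ** 2   # auditory reliability
    w_vis = r_vis / (r_vis + r_aud)
    w_aud = r_aud / (r_vis + r_aud)
    s_hat = w_vis * s_vis + w_aud * s_aud
    # The combined estimate's variance is lower than either cue's alone.
    var_hat = 1.0 / (r_vis + r_aud)
    return s_hat, var_hat

# Hypothetical conflict trial: visual cue at 0 deg (sigma = 1),
# auditory cue at 10 deg (sigma = 3). The judgment lands near the
# visual location because vision is the more reliable signal here.
estimate, variance = mle_combine(0.0, 1.0, 10.0, 3.0)
```

Under this rule a pure visual-capture account corresponds to the limiting case where the visual weight is 1 regardless of reliability; the paper's finding of an overall visual bias sits between the two models.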


Year:  2003        PMID: 12868643     DOI: 10.1364/josaa.20.001391

Source DB:  PubMed          Journal:  J Opt Soc Am A Opt Image Sci Vis        ISSN: 1084-7529            Impact factor:   2.129


Citing articles: 116 in total

1. (Review) Knowing how much you don't know: a neural organization of uncertainty estimates.

Authors:  Dominik R Bach; Raymond J Dolan
Journal:  Nat Rev Neurosci       Date:  2012-07-11       Impact factor: 34.870

2.  Auditory temporal modulation of the visual Ternus effect: the influence of time interval.

Authors:  Zhuanghua Shi; Lihan Chen; Hermann J Müller
Journal:  Exp Brain Res       Date:  2010-05-16       Impact factor: 1.972

3.  Assessing the effect of visual and tactile distractors on the perception of auditory apparent motion.

Authors:  Daniel Sanabria; Salvador Soto-Faraco; Charles Spence
Journal:  Exp Brain Res       Date:  2005-08-26       Impact factor: 1.972

4.  Perception of angular displacement without landmarks: evidence for Bayesian fusion of vestibular, optokinetic, podokinesthetic, and cognitive information.

Authors:  Reinhart Jürgens; Wolfgang Becker
Journal:  Exp Brain Res       Date:  2006-07-11       Impact factor: 1.972

5.  Combining priors and noisy visual cues in a rapid pointing task.

Authors:  Hadley Tassinari; Todd E Hudson; Michael S Landy
Journal:  J Neurosci       Date:  2006-10-04       Impact factor: 6.167

6.  Auditory motion affects visual motion perception in a speeded discrimination task.

Authors:  Daniel Sanabria; Juan Lupiáñez; Charles Spence
Journal:  Exp Brain Res       Date:  2007-03-13       Impact factor: 1.972

7. (Review) Decision theory, reinforcement learning, and the brain.

Authors:  Peter Dayan; Nathaniel D Daw
Journal:  Cogn Affect Behav Neurosci       Date:  2008-12       Impact factor: 3.282

8.  Effects of augmentative visual training on audio-motor mapping.

Authors:  Gabrielle L Hands; Eric Larson; Cara E Stepp
Journal:  Hum Mov Sci       Date:  2014-02-12       Impact factor: 2.161

9.  Dynamic reweighting of visual and vestibular cues during self-motion perception.

Authors:  Christopher R Fetsch; Amanda H Turner; Gregory C DeAngelis; Dora E Angelaki
Journal:  J Neurosci       Date:  2009-12-09       Impact factor: 6.167

10.  Multisensory oddity detection as Bayesian inference.

Authors:  Timothy Hospedales; Sethu Vijayakumar
Journal:  PLoS One       Date:  2009-01-15       Impact factor: 3.240

