
Bayesian priors are encoded independently from likelihoods in human multisensory perception.

Ulrik R Beierholm, Steven R Quartz, Ladan Shams

Abstract

It has been shown that human combination of crossmodal information is highly consistent with an optimal Bayesian model performing causal inference. These findings have shed light on the computational principles governing crossmodal integration and segregation. Intuitively, in a Bayesian framework priors represent a priori information about the environment, i.e., information available before the given stimuli are encountered, and are thus not dependent on the current stimuli. While many consider this interpretation a defining characteristic of Bayesian computation, Bayes' rule per se does not require that priors remain constant despite significant changes in the stimulus; therefore, demonstrating Bayes-optimal performance in a task does not imply that the priors are invariant to varying likelihoods. This issue has not been addressed before. Here we empirically investigated the independence of the priors from the likelihoods by strongly manipulating the presumed likelihoods (using two drastically different sets of stimuli) and examining whether the estimated priors change or remain the same. The results suggest that the estimated prior probabilities are indeed independent of the immediate input and hence of the likelihood.
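The causal-inference computation the abstract refers to can be sketched as follows. This is an illustrative toy model in the spirit of the standard Gaussian causal-inference framework, not the paper's fitted model; the parameter names (sigma_a, sigma_v, sigma_p, p_common) and all values are assumptions. The point it demonstrates is structural: the prior p_common enters Bayes' rule as a separate factor from the likelihoods, so in principle it can stay fixed while the stimuli (and hence the likelihoods) change drastically.

```python
import math

def posterior_common(x_a, x_v, sigma_a, sigma_v, sigma_p, p_common):
    """Posterior probability that an auditory cue x_a and a visual cue x_v
    arose from a common cause, under a toy Gaussian causal-inference model.
    All parameters are illustrative assumptions, not values from the paper."""
    # Likelihood under a common cause (C=1): both cues come from one source
    # location s ~ N(0, sigma_p^2); s is integrated out analytically.
    var = (sigma_a**2 * sigma_v**2
           + sigma_a**2 * sigma_p**2
           + sigma_v**2 * sigma_p**2)
    like_c1 = math.exp(-((x_a - x_v)**2 * sigma_p**2
                         + x_a**2 * sigma_v**2
                         + x_v**2 * sigma_a**2) / (2 * var)) \
              / (2 * math.pi * math.sqrt(var))
    # Likelihood under independent causes (C=2): each cue has its own source.
    var_a = sigma_a**2 + sigma_p**2
    var_v = sigma_v**2 + sigma_p**2
    like_c2 = (math.exp(-x_a**2 / (2 * var_a)) / math.sqrt(2 * math.pi * var_a)
               * math.exp(-x_v**2 / (2 * var_v)) / math.sqrt(2 * math.pi * var_v))
    # Bayes' rule: the prior p_common is a stimulus-independent factor,
    # multiplied with whatever likelihoods the current stimuli produce.
    return (like_c1 * p_common
            / (like_c1 * p_common + like_c2 * (1 - p_common)))

# Coincident cues favor a common cause; widely separated cues do not,
# even though the prior p_common is identical in both calls.
p_near = posterior_common(0.0, 0.0, 1.0, 1.0, 10.0, 0.5)
p_far = posterior_common(0.0, 20.0, 1.0, 1.0, 10.0, 0.5)
```

Swapping in a very different stimulus set changes only like_c1 and like_c2 in this sketch; the question the paper tests empirically is whether human observers' estimated p_common likewise stays constant under such manipulations.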


Year:  2009        PMID: 19757901     DOI: 10.1167/9.5.23

Source DB:  PubMed          Journal:  J Vis        ISSN: 1534-7362            Impact factor:   2.240


Related articles: 35 in total

1.  Contextual factors multiplex to control multisensory processes.

Authors:  Beatriz R Sarmiento; Pawel J Matusz; Daniel Sanabria; Micah M Murray
Journal:  Hum Brain Mapp       Date:  2015-10-15       Impact factor: 5.038

2.  Supramodal representation of temporal priors calibrates interval timing.

Authors:  Huihui Zhang; Xiaolin Zhou
Journal:  J Neurophysiol       Date:  2017-06-14       Impact factor: 2.714

3.  Stimulus intensity modulates multisensory temporal processing.

Authors:  Juliane Krueger Fister; Ryan A Stevenson; Aaron R Nidiffer; Zachary P Barnett; Mark T Wallace
Journal:  Neuropsychologia       Date:  2016-02-23       Impact factor: 3.139

Review 4.  Multisensory constraints on awareness.

Authors:  Ophelia Deroy; Yi-Chuan Chen; Charles Spence
Journal:  Philos Trans R Soc Lond B Biol Sci       Date:  2014-03-17       Impact factor: 6.237

Review 5.  Bayesian models: the structure of the world, uncertainty, behavior, and the brain.

Authors:  Iris Vilares; Konrad Kording
Journal:  Ann N Y Acad Sci       Date:  2011-04       Impact factor: 5.691

Review 6.  The COGs (context, object, and goals) in multisensory processing.

Authors:  Sanne ten Oever; Vincenzo Romei; Nienke van Atteveldt; Salvador Soto-Faraco; Micah M Murray; Pawel J Matusz
Journal:  Exp Brain Res       Date:  2016-03-01       Impact factor: 1.972

7.  Oscillatory Properties of Functional Connections Between Sensory Areas Mediate Cross-Modal Illusory Perception.

Authors:  Jason Cooke; Claudia Poch; Helge Gillmeister; Marcello Costantini; Vincenzo Romei
Journal:  J Neurosci       Date:  2019-05-20       Impact factor: 6.167

Review 8.  The Influence of Auditory Cues on Bodily and Movement Perception.

Authors:  Tasha R Stanton; Charles Spence
Journal:  Front Psychol       Date:  2020-01-17

9.  Modality-specific attention attenuates visual-tactile integration and recalibration effects by reducing prior expectations of a common source for vision and touch.

Authors:  Stephanie Badde; Karen T Navarro; Michael S Landy
Journal:  Cognition       Date:  2020-02-06

10.  Comparison of congruence judgment and auditory localization tasks for assessing the spatial limits of visual capture.

Authors:  Adam K Bosen; Justin T Fleming; Sarah E Brown; Paul D Allen; William E O'Neill; Gary D Paige
Journal:  Biol Cybern       Date:  2016-11-04       Impact factor: 2.086


Beijing Coyote Bioscience Co., Ltd. © 2022-2023.