A hierarchical Bayesian model for learning nonlinear statistical regularities in nonstationary natural signals.

Yan Karklin, Michael S Lewicki.

Abstract

Capturing statistical regularities in complex, high-dimensional data is an important problem in machine learning and signal processing. Models such as principal component analysis (PCA) and independent component analysis (ICA) make few assumptions about the structure in the data and have good scaling properties, but they are limited to representing linear statistical regularities and assume that the distribution of the data is stationary. For many natural, complex signals, the latent variables often exhibit residual dependencies as well as nonstationary statistics. Here we present a hierarchical Bayesian model that is able to capture higher-order nonlinear structure and represent nonstationary data distributions. The model is a generalization of ICA in which the basis function coefficients are no longer assumed to be independent; instead, the dependencies in their magnitudes are captured by a set of density components. Each density component describes a common pattern of deviation from the marginal density of the pattern ensemble; in different combinations, they can describe nonstationary distributions. Adapting the model to image or audio data yields a nonlinear, distributed code for higher-order statistical regularities that reflect more abstract, invariant properties of the signal.
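A minimal generative sketch of the two-layer model the abstract describes: a higher-order variable v selects a combination of density components B whose output sets the per-coefficient log-variance, and the signal is a linear mixture of the resulting sparse coefficients. Dimensions, variable names, and the Laplacian coefficient prior are illustrative assumptions, not the paper's exact choices (the authors use a generalized Gaussian prior and learn A and B from data).

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions (assumptions, not from the paper)
n_pix, n_basis, n_density = 64, 64, 10

A = rng.standard_normal((n_pix, n_basis))      # linear basis, as in ICA
B = rng.standard_normal((n_basis, n_density))  # density components

# Higher-order variables: each combination of density components
# deforms the marginal coefficient density, so different draws of v
# yield different (nonstationary) distributions over the same basis.
v = rng.standard_normal(n_density)
log_var = B @ v                                # per-coefficient log-variance

# Sparse coefficients whose scales are modulated by v
# (Laplacian stands in for the paper's generalized Gaussian)
s = rng.laplace(scale=np.exp(0.5 * log_var), size=n_basis)

x = A @ s                                      # generated signal patch
```

With B = 0 the coefficients revert to independent, identically scaled sources and the model collapses to standard ICA, which is the sense in which it is a generalization.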

Year:  2005        PMID: 15720773     DOI: 10.1162/0899766053011474

Source DB:  PubMed          Journal:  Neural Comput        ISSN: 0899-7667            Impact factor:   2.026


Citing articles: 26 in total

1.  Perceptron learning rule derived from spike-frequency adaptation and spike-time-dependent plasticity.

Authors:  Prashanth D'Souza; Shih-Chii Liu; Richard H R Hahnloser
Journal:  Proc Natl Acad Sci U S A       Date:  2010-02-18       Impact factor: 11.205

2.  Emergence of complex cell properties by learning to generalize in natural scenes.

Authors:  Yan Karklin; Michael S Lewicki
Journal:  Nature       Date:  2008-11-19       Impact factor: 49.962

3.  Computing local edge probability in natural scenes from a population of oriented simple cells.

Authors:  Chaithanya A Ramachandra; Bartlett W Mel
Journal:  J Vis       Date:  2013-12-31       Impact factor: 2.240

4.  The impact on midlevel vision of statistically optimal divisive normalization in V1.

Authors:  Ruben Coen-Cagli; Odelia Schwartz
Journal:  J Vis       Date:  2013-07-15       Impact factor: 2.240

Review 5.  Mental imagery in animals: Learning, memory, and decision-making in the face of missing information.

Authors:  Aaron P Blaisdell
Journal:  Learn Behav       Date:  2019-09       Impact factor: 1.986

6.  Visual attention and flexible normalization pools.

Authors:  Odelia Schwartz; Ruben Coen-Cagli
Journal:  J Vis       Date:  2013-01-23       Impact factor: 2.240

7.  Incorporating naturalistic correlation structure improves spectrogram reconstruction from neuronal activity in the songbird auditory midbrain.

Authors:  Alexandro D Ramirez; Yashar Ahmadian; Joseph Schumacher; David Schneider; Sarah M N Woolley; Liam Paninski
Journal:  J Neurosci       Date:  2011-03-09       Impact factor: 6.167

Review 8.  Memory-prediction errors and their consequences in schizophrenia.

Authors:  Michael S Kraus; Richard S E Keefe; Ranga K R Krishnan
Journal:  Neuropsychol Rev       Date:  2009-07-03       Impact factor: 7.444

9.  A structured model of video reproduces primary visual cortical organisation.

Authors:  Pietro Berkes; Richard E Turner; Maneesh Sahani
Journal:  PLoS Comput Biol       Date:  2009-09-04       Impact factor: 4.475

10.  Natural image coding in V1: how much use is orientation selectivity?

Authors:  Jan Eichhorn; Fabian Sinz; Matthias Bethge
Journal:  PLoS Comput Biol       Date:  2009-04-03       Impact factor: 4.475
