
A Hebbian/Anti-Hebbian Neural Network for Linear Subspace Learning: A Derivation from Multidimensional Scaling of Streaming Data.

Cengiz Pehlevan, Tao Hu, Dmitri B Chklovskii.

Abstract

Neural network models of early sensory processing typically reduce the dimensionality of streaming input data. Such networks learn the principal subspace, in the sense of principal component analysis, by adjusting synaptic weights according to activity-dependent learning rules. When derived from a principled cost function, these rules are nonlocal and hence biologically implausible. At the same time, biologically plausible local rules have been postulated rather than derived from a principled cost function. Here, to bridge this gap, we derive a biologically plausible network for subspace learning on streaming data by minimizing a principled cost function. In a departure from previous work, where cost was quantified by the representation, or reconstruction, error, we adopt a multidimensional scaling cost function for streaming data. The resulting algorithm relies only on biologically plausible Hebbian and anti-Hebbian local learning rules. In a stochastic setting, synaptic weights converge to a stationary state, which projects the input data onto the principal subspace. If the data are generated by a nonstationary distribution, the network can track the principal subspace. Thus, our result makes a step toward an algorithmic theory of neural computation.


Year:  2015        PMID: 25973548     DOI: 10.1162/NECO_a_00745

Source DB:  PubMed          Journal:  Neural Comput        ISSN: 0899-7667            Impact factor:   2.026
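The abstract describes an online network in which feedforward weights follow a Hebbian rule and lateral weights an anti-Hebbian rule, so that the outputs come to span the principal subspace of the streaming inputs. The following is a minimal NumPy sketch in that spirit; it is not the paper's exact algorithm (the paper normalizes updates by cumulative neural activity, whereas this sketch assumes a fixed learning rate `eta` and full lateral gain control, both simplifications).

```python
import numpy as np

def subspace_learn(X, k, eta=0.02, seed=0):
    """Online Hebbian/anti-Hebbian subspace learner (simplified sketch).

    Feedforward weights W are updated with a Hebbian rule (output times
    input); lateral weights M with an anti-Hebbian rule (output times
    output). Both rules are local: each synapse uses only the activity
    of the two neurons it connects.
    """
    n, d = X.shape
    rng = np.random.default_rng(seed)
    W = rng.normal(scale=1.0 / np.sqrt(d), size=(k, d))  # feedforward weights
    M = np.zeros((k, k))                                 # lateral weights
    for x in X:
        # steady state of the recurrent dynamics y = W x - M y
        y = np.linalg.solve(np.eye(k) + M, W @ x)
        W += eta * (np.outer(y, x) - W)  # Hebbian: correlate output with input
        M += eta * (np.outer(y, y) - M)  # anti-Hebbian: decorrelate outputs
    return W, M

# Demo on synthetic streaming data with a dominant 2-D subspace (assumed
# toy setup, not data from the paper).
rng = np.random.default_rng(1)
scales = np.array([4.0, 3.0, 0.3, 0.3, 0.3, 0.3])
X = rng.normal(size=(3000, 6)) * scales
W, M = subspace_learn(X, k=2)
F = np.linalg.solve(np.eye(2) + M, W)  # effective input-to-output filter
```

After training, the rows of the effective filter `F` should concentrate on the two high-variance input coordinates, i.e. the network projects its inputs onto the principal subspace, as the abstract states for the stationary case.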


Related articles: 10 in total

1.  Stable population coding for working memory coexists with heterogeneous neural dynamics in prefrontal cortex.

Authors:  John D Murray; Alberto Bernacchia; Nicholas A Roy; Christos Constantinidis; Ranulfo Romo; Xiao-Jing Wang
Journal:  Proc Natl Acad Sci U S A       Date:  2016-12-27       Impact factor: 11.205

2.  Postural Representations of the Hand in the Primate Sensorimotor Cortex.

Authors:  James M Goodman; Gregg A Tabot; Alex S Lee; Aneesha K Suresh; Alexander T Rajan; Nicholas G Hatsopoulos; Sliman Bensmaia
Journal:  Neuron       Date:  2019-10-24       Impact factor: 17.173

3.  Causal Inference and Explaining Away in a Spiking Network.

Authors:  Rubén Moreno-Bote; Jan Drugowitsch
Journal:  Sci Rep       Date:  2015-12-01       Impact factor: 4.379

4.  [Review] Hebbian plasticity requires compensatory processes on multiple timescales.

Authors:  Friedemann Zenke; Wulfram Gerstner
Journal:  Philos Trans R Soc Lond B Biol Sci       Date:  2017-03-05       Impact factor: 6.237

5.  Unsupervised learning by competing hidden units.

Authors:  Dmitry Krotov; John J Hopfield
Journal:  Proc Natl Acad Sci U S A       Date:  2019-03-29       Impact factor: 11.205

6.  Error-Gated Hebbian Rule: A Local Learning Rule for Principal and Independent Component Analysis.

Authors:  Takuya Isomura; Taro Toyoizumi
Journal:  Sci Rep       Date:  2018-01-30       Impact factor: 4.379

7.  An Oscillatory Neural Autoencoder Based on Frequency Modulation and Multiplexing.

Authors:  Karthik Soman; Vignesh Muralidharan; V Srinivasa Chakravarthy
Journal:  Front Comput Neurosci       Date:  2018-07-10       Impact factor: 2.380

8.  A hierarchical anti-Hebbian network model for the formation of spatial cells in three-dimensional space.

Authors:  Karthik Soman; Srinivasa Chakravarthy; Michael M Yartsev
Journal:  Nat Commun       Date:  2018-10-02       Impact factor: 14.919

9.  Saccade Velocity Driven Oscillatory Network Model of Grid Cells.

Authors:  Ankur Chauhan; Karthik Soman; V Srinivasa Chakravarthy
Journal:  Front Comput Neurosci       Date:  2019-01-10       Impact factor: 2.380

10.  Learning to represent signals spike by spike.

Authors:  Wieland Brendel; Ralph Bourdoukan; Pietro Vertechi; Christian K Machens; Sophie Denève
Journal:  PLoS Comput Biol       Date:  2020-03-16       Impact factor: 4.475

