
Why Do Similarity Matching Objectives Lead to Hebbian/Anti-Hebbian Networks?

Cengiz Pehlevan, Anirvan M Sengupta, Dmitri B Chklovskii

Abstract

Modeling self-organization of neural networks for unsupervised learning using Hebbian and anti-Hebbian plasticity has a long history in neuroscience. Yet derivations of single-layer networks with such local learning rules from principled optimization objectives became possible only recently, with the introduction of similarity matching objectives. What explains the success of similarity matching objectives in deriving neural networks with local learning rules? Here, using dimensionality reduction as an example, we introduce several variable substitutions that illuminate the success of similarity matching. We show that the full network objective may be optimized separately for each synapse using local learning rules in both the offline and online settings. We formalize the long-standing intuition of the rivalry between Hebbian and anti-Hebbian rules by formulating a min-max optimization problem. We introduce a novel dimensionality reduction objective using fractional matrix exponents. To illustrate the generality of our approach, we apply it to a novel formulation of dimensionality reduction combined with whitening. We confirm numerically that the networks with learning rules derived from principled objectives perform better than those with heuristic learning rules.
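The abstract describes deriving networks with local Hebbian/anti-Hebbian learning rules from a similarity matching objective, formulated as a min-max problem over feedforward and lateral weights. A minimal NumPy sketch of the kind of online algorithm the paper discusses (constant learning rates, neural dynamics solved at their fixed point; the function name, learning rates `eta` and `tau`, and initialization are illustrative assumptions, not the paper's exact algorithm):

```python
import numpy as np

def similarity_matching(X, k, eta=0.01, tau=0.5, seed=0):
    """Online similarity matching sketch: Hebbian feedforward weights W,
    anti-Hebbian lateral weights M, updated locally per input sample.

    X : (n, T) array of T input samples of dimension n
    k : output dimensionality (k <= n)
    """
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    W = rng.standard_normal((k, n)) / np.sqrt(n)  # feedforward weights
    M = np.eye(k)                                 # lateral weights (symmetric PD)
    for x in X.T:
        # Fixed point of the neural dynamics dy/dt = W x - M y
        y = np.linalg.solve(M, W @ x)
        # Local updates: each synapse uses only its pre/post activities
        W += eta * (np.outer(y, x) - W)           # Hebbian
        M += (eta / tau) * (np.outer(y, y) - M)   # anti-Hebbian
    return W, M
```

After training, the effective filter `M^{-1} W` approximately projects inputs onto their top-k principal subspace; the rivalry between the Hebbian growth of `W` and the anti-Hebbian decorrelation through `M` is what the paper formalizes as a min-max objective.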


Year:  2017        PMID: 28957017     DOI: 10.1162/neco_a_01018

Source DB:  PubMed          Journal:  Neural Comput        ISSN: 0899-7667            Impact factor:   2.026


Related articles: 10 in total

1.  The information theory of developmental pruning: Optimizing global network architectures using local synaptic rules.

Authors:  Carolin Scholl; Michael E Rule; Matthias H Hennig
Journal:  PLoS Comput Biol       Date:  2021-10-11       Impact factor: 4.475

2.  Structured random receptive fields enable informative sensory encodings.

Authors:  Biraj Pandey; Marius Pachitariu; Bingni W Brunton; Kameron Decker Harris
Journal:  PLoS Comput Biol       Date:  2022-10-10       Impact factor: 4.779

3.  [Review] Adaptive control of synaptic plasticity integrates micro- and macroscopic network function.

Authors:  Daniel N Scott; Michael J Frank
Journal:  Neuropsychopharmacology       Date:  2022-08-29       Impact factor: 8.294

4.  Place cells may simply be memory cells: Memory compression leads to spatial tuning and history dependence.

Authors:  Marcus K Benna; Stefano Fusi
Journal:  Proc Natl Acad Sci U S A       Date:  2021-12-21       Impact factor: 12.779

5.  Unsupervised changes in core object recognition behavior are predicted by neural plasticity in inferior temporal cortex.

Authors:  Xiaoxuan Jia; Ha Hong; James J DiCarlo
Journal:  Elife       Date:  2021-06-11       Impact factor: 8.140

6.  Contrastive Similarity Matching for Supervised Learning.

Authors:  Shanshan Qin; Nayantara Mudur; Cengiz Pehlevan
Journal:  Neural Comput       Date:  2021-04-13       Impact factor: 2.026

7.  Unsupervised learning by competing hidden units.

Authors:  Dmitry Krotov; John J Hopfield
Journal:  Proc Natl Acad Sci U S A       Date:  2019-03-29       Impact factor: 11.205

8.  Self-healing codes: How stable neural populations can track continually reconfiguring neural representations.

Authors:  Michael E Rule; Timothy O'Leary
Journal:  Proc Natl Acad Sci U S A       Date:  2022-02-15       Impact factor: 12.779

9.  Learning to represent signals spike by spike.

Authors:  Wieland Brendel; Ralph Bourdoukan; Pietro Vertechi; Christian K Machens; Sophie Denève
Journal:  PLoS Comput Biol       Date:  2020-03-16       Impact factor: 4.475

10.  An avian cortical circuit for chunking tutor song syllables into simple vocal-motor units.

Authors:  Emily L Mackevicius; Michael T L Happ; Michale S Fee
Journal:  Nat Commun       Date:  2020-10-06       Impact factor: 14.919

