
STDP-Compatible Approximation of Backpropagation in an Energy-Based Model.

Yoshua Bengio, Thomas Mesnard, Asja Fischer, Saizheng Zhang, Yuhuai Wu.

Abstract

We show that Langevin Markov chain Monte Carlo inference in an energy-based model with latent variables has the property that the early steps of inference, starting from a stationary point, correspond to propagating error gradients into internal layers, similar to backpropagation. The backpropagated error is with respect to output units that have received an outside driving force pushing them away from the stationary point. Backpropagated error gradients correspond to temporal derivatives with respect to the activation of hidden units. These lead to a weight update proportional to the product of the presynaptic firing rate and the temporal rate of change of the postsynaptic firing rate. Simulations and a theoretical argument suggest that this rate-based update rule is consistent with those associated with spike-timing-dependent plasticity. The ideas presented in this article could be an element of a theory for explaining how brains perform credit assignment in deep hierarchies as efficiently as backpropagation does, with neural computation corresponding to both approximate inference in continuous-valued latent variables and error backpropagation, at the same time.
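The abstract describes a rate-based learning rule in which the weight change is proportional to the presynaptic firing rate times the temporal rate of change of the postsynaptic firing rate. A minimal sketch of that rule, using a finite-difference estimate of the temporal derivative (variable names `eta`, `rho_pre`, `rho_post` and the step sizes are illustrative assumptions, not taken from the paper):

```python
import numpy as np

def weight_update(w, rho_pre, rho_post_now, rho_post_prev, dt=1e-3, eta=0.01):
    """Delta w ~ eta * rho_pre * d(rho_post)/dt, estimated by finite differences."""
    d_rho_post = (rho_post_now - rho_post_prev) / dt   # temporal derivative of postsynaptic rates
    return w + eta * np.outer(d_rho_post, rho_pre)     # outer product: one update per synapse

# Toy example: 3 postsynaptic units, 2 presynaptic units.
w = np.zeros((3, 2))
rho_pre = np.array([1.0, 0.5])            # presynaptic firing rates
rho_prev = np.array([0.2, 0.2, 0.2])      # postsynaptic rates at t - dt
rho_now = np.array([0.3, 0.1, 0.2])       # postsynaptic rates at t
w_new = weight_update(w, rho_pre, rho_now, rho_prev)
# A rising postsynaptic rate potentiates the synapse, a falling rate
# depresses it, and an unchanged rate leaves the weight untouched.
```

This sign structure is what makes the rule STDP-compatible in spirit: potentiation when the postsynaptic rate is increasing, depression when it is decreasing.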

Year: 2017        PMID: 28095200        DOI: 10.1162/NECO_a_00934

Source DB: PubMed        Journal: Neural Comput        ISSN: 0899-7667        Impact factor: 2.026


Citing articles: 10 in total

1.  Can the Brain Do Backpropagation? - Exact Implementation of Backpropagation in Predictive Coding Networks.

Authors:  Yuhang Song; Thomas Lukasiewicz; Zhenghua Xu; Rafal Bogacz
Journal:  Adv Neural Inf Process Syst       Date:  2020

2.  Neurons learn by predicting future activity.

Authors:  Artur Luczak; Bruce L McNaughton; Yoshimasa Kubo
Journal:  Nat Mach Intell       Date:  2022-01-25

Review 3.  Deep Learning With Spiking Neurons: Opportunities and Challenges.

Authors:  Michael Pfeiffer; Thomas Pfeil
Journal:  Front Neurosci       Date:  2018-10-25       Impact factor: 4.677

4.  Brain-inspired classical conditioning model.

Authors:  Yuxuan Zhao; Yi Zeng; Guang Qiao
Journal:  iScience       Date:  2020-12-25

5.  Predictive Neuronal Adaptation as a Basis for Consciousness.

Authors:  Artur Luczak; Yoshimasa Kubo
Journal:  Front Syst Neurosci       Date:  2022-01-11

6.  Self-backpropagation of synaptic modifications elevates the efficiency of spiking and artificial neural networks.

Authors:  Tielin Zhang; Xiang Cheng; Shuncheng Jia; Mu-Ming Poo; Yi Zeng; Bo Xu
Journal:  Sci Adv       Date:  2021-10-20       Impact factor: 14.136

7.  Training spiking neuronal networks to perform motor control using reinforcement and evolutionary learning.

Authors:  Daniel Haşegan; Matt Deible; Christopher Earl; David D'Onofrio; Hananel Hazan; Haroon Anwar; Samuel A Neymotin
Journal:  Front Comput Neurosci       Date:  2022-09-30       Impact factor: 3.387

Review 8.  Theories of Error Back-Propagation in the Brain.

Authors:  James C R Whittington; Rafal Bogacz
Journal:  Trends Cogn Sci       Date:  2019-01-28       Impact factor: 20.229

9.  A Computational Theory for the Emergence of Grammatical Categories in Cortical Dynamics.

Authors:  Dario Dematties; Silvio Rizzi; George K Thiruvathukal; Mauricio David Pérez; Alejandro Wainselboim; B Silvano Zanutto
Journal:  Front Neural Circuits       Date:  2020-04-16       Impact factor: 3.492

10.  The color phi phenomenon: Not so special, after all?

Authors:  Lars Keuninckx; Axel Cleeremans
Journal:  PLoS Comput Biol       Date:  2021-09-03       Impact factor: 4.475


北京卡尤迪生物科技股份有限公司 © 2022-2023.