
Convergence of stochastic learning in perceptrons with binary synapses.

Walter Senn, Stefano Fusi.

Abstract

The efficacy of a biological synapse is naturally bounded, and at some resolution it is discrete at the latest level of single vesicles. The finite number of synaptic states dramatically reduces the storage capacity of a network when online learning is considered (i.e., when the synapses are immediately modified by each pattern): the trace of old memories decays exponentially with the number of new memories (palimpsest property). Moreover, finding the discrete synaptic strengths which enable the classification of linearly separable patterns is a combinatorially hard problem known to be NP-complete. In this paper we show that learning with discrete (binary) synapses is nevertheless possible with high probability if a randomly selected fraction of synapses is modified following each stimulus presentation (slow stochastic learning). As an additional constraint, the synapses are changed only if the output neuron does not give the desired response, as in classical perceptron learning. We prove that for linearly separable classes of patterns the stochastic learning algorithm converges with arbitrarily high probability in a finite number of presentations, provided that the number of neurons encoding the patterns is large enough. The stochastic learning algorithm is successfully applied to a standard classification problem of nonlinearly separable patterns by using multiple, stochastically independent output units, achieving a performance comparable to the best reported for the task.
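The learning rule summarized in the abstract is compact enough to illustrate directly. Below is a minimal Python sketch of slow stochastic perceptron learning with binary synapses, under stated assumptions: binary {0, 1} inputs and weights, labels in {-1, +1}, and illustrative names (q for the transition probability, theta for the threshold) that are not the paper's notation. On an erroneous response, each eligible synapse flips toward the desired state only with small probability q; correct responses leave the synapses untouched.

    import numpy as np

    def stochastic_binary_perceptron(patterns, labels, q=0.05, theta=0.0,
                                     max_presentations=100_000, rng=None):
        """Slow stochastic perceptron learning with binary synapses.

        A sketch of the rule described in the abstract, not the paper's
        exact implementation: synapses are modified only on errors, and
        each eligible synapse is changed with small probability q.
        """
        rng = rng or np.random.default_rng()
        n_patterns, n_inputs = patterns.shape
        w = rng.integers(0, 2, size=n_inputs)   # binary synapses in {0, 1}

        for t in range(max_presentations):
            i = rng.integers(n_patterns)        # present a random pattern
            x, y = patterns[i], labels[i]       # y in {-1, +1}
            out = 1 if w @ x > theta else -1
            if out == y:                        # correct: no change
                continue
            # Wrong output: synapses at active inputs are candidates for a
            # flip toward the desired state, each realized with prob. q.
            if y > 0:
                candidates = (x > 0) & (w == 0)     # potentiate silent synapses
            else:
                candidates = (x > 0) & (w == 1)     # depress active synapses
            flips = candidates & (rng.random(n_inputs) < q)
            w[flips] = 1 - w[flips]
            # stop once every pattern is classified correctly
            if all((1 if w @ p > theta else -1) == l
                   for p, l in zip(patterns, labels)):
                return w, t + 1
        return w, max_presentations

Keeping q small is what makes the learning "slow": only an expected fraction q of the eligible synapses changes after each error, so the weight vector performs a gradual random walk toward a separating configuration, which is the regime the paper's convergence result addresses.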

Year: 2005    PMID: 16089765    DOI: 10.1103/PhysRevE.71.061907

Source DB: PubMed    Journal: Phys Rev E Stat Nonlin Soft Matter Phys    ISSN: 1539-3755


Related articles: 8 in total

1.  Efficient supervised learning in networks with binary synapses.

Authors:  Carlo Baldassi; Alfredo Braunstein; Nicolas Brunel; Riccardo Zecchina
Journal:  Proc Natl Acad Sci U S A       Date:  2007-06-20       Impact factor: 11.205

2.  In situ unsupervised learning using stochastic switching in magneto-electric magnetic tunnel junctions.

Authors:  Indranil Chakraborty; Amogh Agrawal; Akhilesh Jaiswal; Gopalakrishnan Srinivasan; Kaushik Roy
Journal:  Philos Trans A Math Phys Eng Sci       Date:  2019-12-23       Impact factor: 4.226

3.  Spintronic Nanodevices for Bioinspired Computing.

Authors:  Julie Grollier; Damien Querlioz; Mark D Stiles
Journal:  Proc IEEE Inst Electr Electron Eng       Date:  2016-09-08       Impact factor: 10.961

4.  Stability of discrete memory states to stochastic fluctuations in neuronal systems.

Authors:  Paul Miller; Xiao-Jing Wang
Journal:  Chaos       Date:  2006-06       Impact factor: 3.642

5.  Recurrent network of perceptrons with three-state synapses achieves competitive classification on real inputs.

Authors:  Yali Amit; Jacob Walker
Journal:  Front Comput Neurosci       Date:  2012-06-22       Impact factor: 2.380

6.  Dynamical Characteristics of Recurrent Neuronal Networks Are Robust Against Low Synaptic Weight Resolution.

Authors:  Stefan Dasbach; Tom Tetzlaff; Markus Diesmann; Johanna Senk
Journal:  Front Neurosci       Date:  2021-12-24       Impact factor: 4.677

7.  Stochastic learning in oxide binary synaptic device for neuromorphic computing.

Authors:  Shimeng Yu; Bin Gao; Zheng Fang; Hongyu Yu; Jinfeng Kang; H-S Philip Wong
Journal:  Front Neurosci       Date:  2013-10-31       Impact factor: 4.677

8.  Multiclass Classification by Adaptive Network of Dendritic Neurons with Binary Synapses Using Structural Plasticity.

Authors:  Shaista Hussain; Arindam Basu
Journal:  Front Neurosci       Date:  2016-03-31       Impact factor: 4.677

