
Role of Synaptic Stochasticity in Training Low-Precision Neural Networks.

Carlo Baldassi, Federica Gerace, Hilbert J Kappen, Carlo Lucibello, Luca Saglietti, Enzo Tartaglione, Riccardo Zecchina

Abstract

Stochasticity and limited precision of synaptic weights in neural network models are key aspects of both biological and hardware modeling of learning processes. Here we show that a neural network model with stochastic binary weights naturally gives prominence to exponentially rare dense regions of solutions with a number of desirable properties such as robustness and good generalization performance, while typical solutions are isolated and hard to find. Binary solutions of the standard perceptron problem are obtained from a simple gradient descent procedure on a set of real values parametrizing a probability distribution over the binary synapses. Both analytical and numerical results are presented. An algorithmic extension that allows the training of discrete deep neural networks is also investigated.
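The core idea in the abstract, descending on real-valued fields that parametrize a distribution over binary synapses and then projecting to a binary solution, can be sketched as follows. This is a minimal illustration, not the authors' actual algorithm from the paper: the teacher-student setup, the hinge-type loss with unit margin, and all variable names are assumptions made for the example. Each synapse w_i in {-1, +1} is given a real field h_i with P(w_i = +1) = sigmoid(2 h_i), so the mean weight is E[w_i] = tanh(h_i), and gradient descent is run through these means.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative teacher-student perceptron: random +/-1 patterns labeled by a
# random binary teacher, well below capacity so binary solutions exist.
N, P = 201, 30
teacher = rng.choice([-1, 1], size=N)
X = rng.choice([-1, 1], size=(P, N))
y = np.sign(X @ teacher)

# Real fields h parametrize P(w_i = +1) = sigmoid(2 h_i), hence E[w_i] = tanh(h_i).
h = 0.01 * rng.standard_normal(N)
lr = 0.05
for _ in range(1000):
    m = np.tanh(h)                      # mean binary weight under the distribution
    margins = y * (X @ m)
    viol = margins < 1.0                # patterns not yet satisfied with margin 1
    if not viol.any():
        break
    # Gradient of a hinge-type loss, chained through m = tanh(h).
    grad = -(y[viol, None] * X[viol]).sum(axis=0) * (1 - m**2)
    h -= lr * grad

w = np.sign(h)                          # project the fields to a binary solution
train_acc = np.mean(np.sign(X @ w) == y)
```

As the fields grow, tanh(h) saturates toward +/-1, so the continuous means approach the binary vector sign(h); the final projection then typically preserves the classification of the training set.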

Year:  2018        PMID: 30004730     DOI: 10.1103/PhysRevLett.120.268103

Source DB:  PubMed          Journal:  Phys Rev Lett        ISSN: 0031-9007            Impact factor:   9.161


  2 in total

1.  A high-bias, low-variance introduction to Machine Learning for physicists.

Authors:  Pankaj Mehta; Ching-Hao Wang; Alexandre G R Day; Clint Richardson; Marin Bukov; Charles K Fisher; David J Schwab
Journal:  Phys Rep       Date:  2019-03-14       Impact factor: 25.600

2.  Shaping the learning landscape in neural networks around wide flat minima.

Authors:  Carlo Baldassi; Fabrizio Pittorino; Riccardo Zecchina
Journal:  Proc Natl Acad Sci U S A       Date:  2019-12-23       Impact factor: 11.205


北京卡尤迪生物科技股份有限公司 (Beijing Coyote Bioscience Co., Ltd.) © 2022-2023.