| Literature DB >> 30004730 |
Carlo Baldassi, Federica Gerace, Hilbert J Kappen, Carlo Lucibello, Luca Saglietti, Enzo Tartaglione, Riccardo Zecchina.
Abstract
Stochasticity and limited precision of synaptic weights in neural network models are key aspects of both biological and hardware modeling of learning processes. Here we show that a neural network model with stochastic binary weights naturally gives prominence to exponentially rare dense regions of solutions with a number of desirable properties such as robustness and good generalization performance, while typical solutions are isolated and hard to find. Binary solutions of the standard perceptron problem are obtained from a simple gradient descent procedure on a set of real values parametrizing a probability distribution over the binary synapses. Both analytical and numerical results are presented. An algorithmic extension that allows one to train discrete deep neural networks is also investigated.
Year: 2018 PMID: 30004730 DOI: 10.1103/PhysRevLett.120.268103
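The procedure described in the abstract — gradient descent on real-valued parameters of a probability distribution over binary synapses — can be illustrated with a minimal sketch. This is not the authors' implementation: the toy teacher-student setup, the parametrization P(w_i = +1) = (1 + tanh(h_i))/2, the margin value, and the hinge-style loss are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy setup (not from the paper): a teacher perceptron
# with binary weights labels random +/-1 patterns.
N, P = 101, 150
w_teacher = rng.choice([-1.0, 1.0], size=N)
X = rng.choice([-1.0, 1.0], size=(P, N))
y = np.sign(X @ w_teacher)

# Real-valued fields h parametrize independent distributions over the
# binary synapses: P(w_i = +1) = (1 + tanh(h_i)) / 2, so the mean
# synapse is m_i = tanh(h_i). Gradient descent acts on h, not on w.
h = np.zeros(N)
lr = 0.1
for _ in range(500):
    m = np.tanh(h)
    margins = y * (X @ m) / np.sqrt(N)
    viol = margins < 0.5          # patterns violating a margin condition
    # Hinge-loss gradient wrt h, chained through m = tanh(h).
    grad = -(y[viol, None] * X[viol]).sum(0) / np.sqrt(N) * (1 - m**2)
    h -= lr * grad

# Read out a binary solution by taking the most probable synapse values.
w_binary = np.where(h >= 0, 1.0, -1.0)
train_acc = np.mean(np.sign(X @ w_binary) == y)
```

The key design point mirrored from the abstract is that the optimization variables are continuous (the fields h), so ordinary gradient descent applies, while the object of interest remains the discrete configuration obtained by binarizing at the end.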
Source DB: PubMed Journal: Phys Rev Lett ISSN: 0031-9007 Impact factor: 9.161