
On the equivalence of Hopfield networks and Boltzmann Machines.

Adriano Barra, Alberto Bernacchia, Enrica Santucci, Pierluigi Contucci.

Abstract

A specific type of neural network, the Restricted Boltzmann Machine (RBM), is implemented for classification and feature detection in machine learning. RBMs are characterized by separate layers of visible and hidden units, which can efficiently learn a generative model of the observed data. We study a "hybrid" version of RBMs, in which the hidden units are analog and the visible units are binary, and we show that the thermodynamics of the visible units is equivalent to that of a Hopfield network, in which the N visible units are the neurons and the P hidden units are the learned patterns. We apply the method of stochastic stability to derive the thermodynamics of the model, considering a formal extension of this technique to the case of multiple sets of stored patterns, which may serve as a benchmark for the study of correlated sets. Our results imply that simulating the dynamics of a Hopfield network, which requires updating N neurons and storing N(N-1)/2 synapses, can be accomplished by a hybrid Boltzmann Machine, which requires updating N+P neurons but storing only NP synapses. In addition, the well-known glass transition of the Hopfield network has a counterpart in the Boltzmann Machine: it corresponds to an optimality criterion for selecting the relative sizes of the hidden and visible layers, resolving the trade-off between flexibility and generality of the model. The low-storage phase of the Hopfield model corresponds to few hidden units and hence an overly constrained RBM, while the spin-glass phase (too many hidden units) corresponds to an unconstrained RBM prone to overfitting the observed data.
Copyright © 2012 Elsevier Ltd. All rights reserved.
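The equivalence stated in the abstract can be illustrated numerically: integrating out analog hidden units with a standard Gaussian prior (an assumption consistent with the hybrid model described, but the specific normalization here is illustrative) turns the RBM's effective energy over the visible units into a Hopfield Hamiltonian with Hebbian couplings built from the P patterns. A minimal sketch, with arbitrary sizes N and P and self-couplings retained so the two expressions agree exactly:

```python
import numpy as np

rng = np.random.default_rng(0)
N, P = 20, 3  # visible neurons, hidden units (illustrative sizes)

# RBM weights: xi[mu, i] couples hidden unit mu to visible unit i,
# playing the role of the mu-th stored pattern of the Hopfield network.
xi = rng.choice([-1.0, 1.0], size=(P, N))

def rbm_effective_energy(sigma):
    """Energy of the visible configuration after integrating out
    Gaussian hidden units: -(1/2) * sum_mu (xi_mu . sigma)^2."""
    overlaps = xi @ sigma
    return -0.5 * np.dot(overlaps, overlaps)

def hopfield_energy(sigma):
    """Hopfield energy with Hebbian couplings J = xi^T xi
    (self-couplings kept so the identity is exact)."""
    J = xi.T @ xi
    return -0.5 * sigma @ J @ sigma

sigma = rng.choice([-1.0, 1.0], size=N)
assert np.isclose(rbm_effective_energy(sigma), hopfield_energy(sigma))

# Storage comparison quoted in the abstract:
hopfield_synapses = N * (N - 1) // 2  # symmetric couplings, no self-terms
rbm_synapses = N * P
print(hopfield_synapses, rbm_synapses)  # 190 vs 60 for N=20, P=3
```

For these sizes the hybrid machine stores NP = 60 weights instead of the Hopfield network's N(N-1)/2 = 190 symmetric synapses, matching the trade-off described above.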

Mesh:

Year:  2012        PMID: 22784924     DOI: 10.1016/j.neunet.2012.06.003

Source DB:  PubMed          Journal:  Neural Netw        ISSN: 0893-6080


  6 in total

1.  A high-bias, low-variance introduction to Machine Learning for physicists.

Authors:  Pankaj Mehta; Ching-Hao Wang; Alexandre G R Day; Clint Richardson; Marin Bukov; Charles K Fisher; David J Schwab
Journal:  Phys Rep       Date:  2019-03-14       Impact factor: 25.600

2.  The interplay of plasticity and adaptation in neural circuits: a generative model.

Authors:  Alberto Bernacchia
Journal:  Front Synaptic Neurosci       Date:  2014-10-30

3.  Information Perspective to Probabilistic Modeling: Boltzmann Machines versus Born Machines.

Authors:  Song Cheng; Jing Chen; Lei Wang
Journal:  Entropy (Basel)       Date:  2018-08-07       Impact factor: 2.524

4.  Occupancy patterns in superorganisms: a spin-glass approach to ant exploration.

Authors:  Javier Cristín; Frederic Bartumeus; Vicenç Méndez; Daniel Campos
Journal:  R Soc Open Sci       Date:  2020-12-16       Impact factor: 2.963

5. (Review) Boltzmann Machines as Generalized Hopfield Networks: A Review of Recent Results and Outlooks.

Authors:  Chiara Marullo; Elena Agliari
Journal:  Entropy (Basel)       Date:  2020-12-29       Impact factor: 2.524

6.  Multiplex visibility graphs to investigate recurrent neural network dynamics.

Authors:  Filippo Maria Bianchi; Lorenzo Livi; Cesare Alippi; Robert Jenssen
Journal:  Sci Rep       Date:  2017-03-10       Impact factor: 4.379

