Neural Networks with a Redundant Representation: Detecting the Undetectable.

Elena Agliari1, Francesco Alemanno2,3, Adriano Barra2,4, Martino Centonze2, Alberto Fachechi2,4.   

Abstract

We consider a three-layer Sejnowski machine and show that features learnt via contrastive divergence have a dual representation as patterns in a dense associative memory of order P=4. The latter is known to be able to store, via a Hebbian prescription, a number of patterns scaling as N^{P-1}, where N denotes the number of constituent binary neurons interacting P-wise. We also prove that, by keeping the dense associative network far from the saturation regime (namely, allowing for a number of patterns scaling only linearly with N, while P>2), such a system is able to perform pattern recognition far below the standard signal-to-noise threshold. In particular, a network with P=4 is able to retrieve information whose intensity is O(1) even in the presence of noise of O(sqrt(N)) in the large-N limit. This striking skill stems from a redundant representation of patterns, which is afforded by the (relatively) low-load information storage, and it helps to explain the impressive pattern-recognition abilities exhibited by new-generation neural networks. The whole theory is developed rigorously, at the replica-symmetric level of approximation, and corroborated by signal-to-noise analysis and Monte Carlo simulations.
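The low-load retrieval mechanism described in the abstract can be sketched numerically. The following minimal NumPy simulation (all parameter values, variable names, and the asynchronous update scheme are illustrative choices, not taken from the paper) stores a handful of patterns in a dense associative memory with P=4 Hebbian couplings and recovers one of them from a heavily corrupted cue:

```python
import numpy as np

rng = np.random.default_rng(0)
N, K, P = 200, 5, 4  # neurons, stored patterns (low load), interaction order

# K random binary (+/-1) patterns, stored via the dense Hebbian prescription
xi = rng.choice([-1, 1], size=(K, N))

# Corrupt pattern 0 by flipping ~30% of its spins
sigma = xi[0].copy()
sigma[rng.random(N) < 0.3] *= -1

# Asynchronous zero-temperature dynamics with the P-wise Hebbian field:
#   h_i = sum_mu xi_i^mu * m_mu^(P-1),  where m_mu = (1/N) xi^mu . sigma
for _ in range(10):
    for i in rng.permutation(N):
        m = xi @ sigma / N            # Mattis overlaps with each pattern
        h = xi[:, i] @ m ** (P - 1)   # local field on neuron i
        sigma[i] = 1 if h >= 0 else -1

overlap = xi[0] @ sigma / N
print(overlap)  # close to 1.0 when retrieval succeeds
```

At this load (K linear-in-N territory, here just K=5), the retrieved pattern's overlap is raised to the power P-1 in the field, so the signal term dominates the cross-pattern noise and the dynamics relax onto the stored pattern.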

Year:  2020        PMID: 32004010     DOI: 10.1103/PhysRevLett.124.028301

Source DB:  PubMed          Journal:  Phys Rev Lett        ISSN: 0031-9007            Impact factor:   9.161


  1 in total

Review 1.  Boltzmann Machines as Generalized Hopfield Networks: A Review of Recent Results and Outlooks.

Authors:  Chiara Marullo; Elena Agliari
Journal:  Entropy (Basel)       Date:  2020-12-29       Impact factor: 2.524


Coyote Bioscience (Beijing) Co., Ltd. © 2022-2023.