Beyond the Maximum Storage Capacity Limit in Hopfield Recurrent Neural Networks
Giorgio Gosti, Viola Folli, Marco Leonetti, Giancarlo Ruocco.
Abstract
In a neural network, an autapse is a particular kind of synapse that links a neuron onto itself. Autapses are almost never allowed, in either artificial or biological neural networks. Moreover, redundant or similar stored states tend to interact destructively. This paper shows how autapses, together with stable-state redundancy, can improve the storage capacity of a recurrent neural network. Recent research shows how, in an N-node Hopfield neural network with autapses, the number of stored patterns (P) is not limited to the well-known bound 0.14N, as it is for networks without autapses. More precisely, it describes how, as the number of stored patterns increases well over the 0.14N threshold, for P much greater than N, the retrieval error asymptotically approaches a value below unity. Consequently, the reduction of retrieval errors allows a number of stored memories that largely exceeds what was previously considered possible. Unfortunately, soon after, new results showed that, in the thermodynamic limit, given a network with autapses in this high-storage regime, the basin of attraction of the stored memories shrinks to a single state. This means that, for each stable state associated with a stored memory, even a single-bit error in the initial pattern leads the system to a stationary state associated with a different memory. This limits the potential use of this kind of Hopfield network as an associative memory. This paper presents a strategy to overcome this limitation by improving the error-correcting characteristics of the Hopfield neural network. The proposed strategy allows us to form what we call an absorbing neighborhood of states surrounding each stored memory. An absorbing neighborhood is a set, defined by a Hamming distance, surrounding a network state; it is absorbing because, in the long-time limit, states inside it are absorbed by stable states in the set. We show that this strategy allows the network to store an exponential number of memory patterns, each surrounded by an absorbing neighborhood of exponentially growing size.
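As a minimal illustration of the setup discussed in the abstract, the sketch below builds an N-node Hopfield network with Hebbian weights whose diagonal is kept, so that each neuron carries an autapse (zeroing the diagonal recovers the standard autapse-free network). The network size, pattern count, and asynchronous update rule are assumptions chosen for illustration, not the paper's exact procedure.

    import numpy as np

    rng = np.random.default_rng(0)
    N, P = 100, 20                            # neurons and stored patterns (illustrative)
    xi = rng.choice([-1, 1], size=(P, N))     # random binary memories

    # Hebbian weight matrix with the diagonal kept: J_ii = P/N != 0,
    # i.e., every neuron has an autapse. The standard autapse-free
    # network would instead call np.fill_diagonal(J, 0).
    J = xi.T @ xi / N

    def update(sigma, J, max_sweeps=100):
        # Asynchronous dynamics: update one neuron at a time, in random
        # order, until no neuron changes (a fixed point is reached).
        sigma = sigma.copy()
        for _ in range(max_sweeps):
            changed = False
            for i in rng.permutation(len(sigma)):
                s = np.sign(J[i] @ sigma)     # local field, autapse term included
                if s != 0 and s != sigma[i]:
                    sigma[i] = s
                    changed = True
            if not changed:
                break
        return sigma

    # Retrieval check: start exactly at a stored memory and count how
    # many bits the reached fixed point differs from it.
    fixed = update(xi[0], J)
    print("bits changed:", int(np.sum(fixed != xi[0])))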
Keywords: Hopfield neural networks; pattern storage; recurrent neural networks
Year: 2019 PMID: 33267440 PMCID: PMC7515255 DOI: 10.3390/e21080726
Source DB: PubMed Journal: Entropy (Basel) ISSN: 1099-4300 Impact factor: 2.524
Figure 1. Sketch of the vector state space $\Sigma$ of a Hopfield RNN. Each point of $\Sigma$ is one of all the $2^N$ possible binary vectors. Among these states, $P'$ with $P' \leq 2^N$ may be steady states and belong to $\Sigma_{SS} \subseteq \Sigma$. (A) In the standard approach to memory storage, each stable state $\sigma^*$ is a single memory state $\xi^\mu$, or a spurious memory $\sigma_s$. (B) In the approach proposed here, each memory element is represented by a neighborhood of vectors which surrounds the "seed" vector $\xi^\mu$. This neighborhood is obtained as the collection of all vectors that differ from $\xi^\mu$ in at most $k$ bits. The neighborhood of $\xi^\mu$ is defined with the shorthand $\chi_k^\mu$.
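The neighborhood $\chi_k^\mu$ in the caption is simply the Hamming ball of radius $k$ around the seed vector. A minimal sketch of its enumeration follows; the function name hamming_ball and the example values are chosen here for illustration.

    from itertools import combinations
    import numpy as np

    def hamming_ball(seed, k):
        # All vectors differing from `seed` in at most k bits: the
        # neighborhood written chi_k^mu in the caption above.
        seed = np.asarray(seed)
        ball = [seed.copy()]
        for d in range(1, k + 1):
            for idx in combinations(range(len(seed)), d):
                v = seed.copy()
                v[list(idx)] *= -1            # flip the chosen d bits
                ball.append(v)
        return ball

    # Example: a 6-bit seed; |chi_2| = 1 + C(6,1) + C(6,2) = 22 states.
    seed = np.array([1, -1, 1, 1, -1, 1])
    print(len(hamming_ball(seed, 2)))         # -> 22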
Figure 2. Given the neighborhood $\chi_k^\mu$ of each memory $\xi^\mu$, the dashed lines show how the retrieval rate changes for a sample of neighbors at a distance $d$ from $\xi^\mu$. The red line shows its average over all memories $\xi^\mu$. The vertical line represents the limit $d = k$, which corresponds to the neighborhood bound.
Figure 3. Given each memory $\xi^\mu$ and its neighborhood $\chi_k^\mu$, the dashed lines show the average distance from $\xi^\mu$ of the attractors reached by neighbors at a distance $d$ from $\xi^\mu$. The red line shows its average over all memories $\xi^\mu$. The vertical and horizontal lines represent the limit $d = k$ and the neighborhood bound.
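Figures 2 and 3 amount to a simple numerical protocol: perturb each stored memory by $d$ bit flips, run the dynamics to an attractor, and record whether it lands back inside the neighborhood and how far from the seed it ends up. Below is a hypothetical, self-contained sketch of such a loop; the synchronous update rule, the parameter values ($N$, $P$, $k$, trial counts), and all function names are illustrative assumptions, not the paper's exact protocol.

    import numpy as np

    rng = np.random.default_rng(1)
    N, P, k, trials = 100, 20, 5, 10          # illustrative values

    xi = rng.choice([-1, 1], size=(P, N))     # memories
    J = xi.T @ xi / N                         # Hebbian weights, diagonal kept (autapses)

    def run_to_attractor(sigma, max_steps=100):
        # Synchronous updates for speed; capped because synchronous
        # Hopfield dynamics can fall into short cycles.
        sigma = sigma.copy()
        for _ in range(max_steps):
            new = np.sign(J @ sigma)
            new[new == 0] = sigma[new == 0]   # keep the old state on zero field
            if np.array_equal(new, sigma):
                break
            sigma = new
        return sigma

    def flip(sigma, d):
        # Perturb a state by flipping d randomly chosen bits.
        v = sigma.copy()
        v[rng.choice(len(v), size=d, replace=False)] *= -1
        return v

    # For each perturbation distance d, record the retrieval rate
    # (fraction of runs whose attractor stays within Hamming distance k
    # of the seed, cf. Figure 2) and the mean attractor distance from
    # the seed (cf. Figure 3).
    for d in range(k + 3):
        hits, dist = 0, 0.0
        for mu in range(P):
            for _ in range(trials):
                a = run_to_attractor(flip(xi[mu], d))
                h = int(np.sum(a != xi[mu]))
                hits += h <= k
                dist += h
        n = P * trials
        print(f"d={d}  rate={hits / n:.2f}  mean_dist={dist / n:.2f}")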
Figure 4. Average distance from the memory, computed over eight Hopfield RNN replicas for different $N$. The vertical line represents the limit $d = k$ and the neighborhood bound.
Figure 5. Mean neighborhood retrieval rate $R$ for eight Hopfield RNN replicas for different $N$ values, averaged over all memory states $\xi^\mu$. The horizontal line represents the optimal recall of all states in the neighborhood.