
An entropic associative memory.

Luis A Pineda, Gibrán Fuentes, Rafael Morales.

Abstract

Natural memories are associative, declarative and distributed, and memory retrieval is a constructive operation. In addition, cues of objects that are not contained in the memory are rejected directly. Symbolic computing memories resemble natural memories in their declarative character, and information can be stored and recovered explicitly; however, they are reproductive rather than constructive, and lack the associative and distributed properties. Sub-symbolic memories developed within the connectionist or artificial neural networks paradigm are associative and distributed, but lack the declarative property and the capability of rejecting objects that are not included in the memory, and memory retrieval is also reproductive. In this paper we present a memory model that sustains all five properties of natural memories. We use Relational-Indeterminate Computing to model associative memory registers that hold distributed representations of individual objects. This mode of computing has an intrinsic computing entropy that measures the indeterminacy of representations, and this parameter determines the operational characteristics of the memory. Associative registers are embedded in an architecture that maps concrete images expressed in modality-specific buffers into abstract representations and vice versa. The framework has been used to model a visual memory holding the representations of hand-written digits. The system has been tested with a set of memory recognition and retrieval experiments with complete and severely occluded images. The results show that there is a range of entropy values, not too low and not too high, in which associative memory registers have a satisfactory performance. The experiments were implemented in a simulation using a standard computer with a GPU, but a parallel architecture may be built in which the memory operations would take very few computing steps.
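The register operations sketched in the abstract can be illustrated with a toy example. The following is a minimal, simplified sketch (not the authors' implementation) of an associative register as a boolean table of value rows by argument columns, with storage as logical disjunction, recognition as material containment of the cue, constructive retrieval as a random choice of a value per column, and the computing entropy as the average log-count of values held per column:

```python
import math
import random

class AssociativeRegister:
    """Toy sketch of an associative memory register:
    a boolean table of shape (m values) x (n arguments)."""

    def __init__(self, n_args, m_vals):
        self.n = n_args
        self.m = m_vals
        self.table = [[False] * n_args for _ in range(m_vals)]

    def store(self, cue):
        # Storage: OR the cue into the table; cue[j] is the
        # value index selected for argument column j.
        for j, v in enumerate(cue):
            self.table[v][j] = True

    def recognize(self, cue):
        # Recognition: accept only if every (value, argument)
        # pair of the cue is contained in the relation; cues of
        # objects not contained in the memory are rejected.
        return all(self.table[v][j] for j, v in enumerate(cue))

    def retrieve(self):
        # Constructive retrieval: pick, for each argument, a
        # random value among those held in that column (the
        # paper additionally constrains this choice by the cue).
        out = []
        for j in range(self.n):
            vals = [v for v in range(self.m) if self.table[v][j]]
            out.append(random.choice(vals) if vals else None)
        return out

    def entropy(self):
        # Computing entropy: average log2 of the number of
        # values per argument column (0 for empty columns).
        total = 0.0
        for j in range(self.n):
            k = sum(self.table[v][j] for v in range(self.m))
            total += math.log2(k) if k > 0 else 0.0
        return total / self.n

reg = AssociativeRegister(n_args=4, m_vals=8)
reg.store([1, 2, 3, 4])
reg.store([1, 6, 3, 0])
print(reg.recognize([1, 2, 3, 0]))  # True: every pair is in the table
print(reg.recognize([5, 2, 3, 4]))  # False: value 5 never stored in column 0
print(reg.entropy())                # 0.5: two columns hold 2 values each
```

Note that the first recognition succeeds even though the exact cue `[1, 2, 3, 0]` was never stored: the distributed, indeterminate representation merges the stored objects, which is why the entropy parameter governs the trade-off between recall and precision reported in the experiments.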

Year:  2021        PMID: 33767252     DOI: 10.1038/s41598-021-86270-7

Source DB:  PubMed          Journal:  Sci Rep        ISSN: 2045-2322            Impact factor:   4.379


  4 in total

1.  Reducing the dimensionality of data with neural networks.

Authors:  G E Hinton; R R Salakhutdinov
Journal:  Science       Date:  2006-07-28       Impact factor: 47.728

2.  Deep learning. (Review)

Authors:  Yann LeCun; Yoshua Bengio; Geoffrey Hinton
Journal:  Nature       Date:  2015-05-28       Impact factor: 49.962

3.  On associative memory.

Authors:  G Palm
Journal:  Biol Cybern       Date:  1980       Impact factor: 2.086

4.  Constructing an Associative Memory System Using Spiking Neural Network.

Authors:  Hu He; Yingjie Shang; Xu Yang; Yingze Di; Jiajun Lin; Yimeng Zhu; Wenhao Zheng; Jinfeng Zhao; Mengyao Ji; Liya Dong; Ning Deng; Yunlin Lei; Zenghao Chai
Journal:  Front Neurosci       Date:  2019-07-03       Impact factor: 4.677

  2 in total

1.  Entropic associative memory for manuscript symbols.

Authors:  Rafael Morales; Noé Hernández; Ricardo Cruz; Victor D Cruz; Luis A Pineda
Journal:  PLoS One       Date:  2022-08-04       Impact factor: 3.752

2.  Weighted entropic associative memory and phonetic learning.

Authors:  Luis A Pineda; Rafael Morales
Journal:  Sci Rep       Date:  2022-10-06       Impact factor: 4.996


Beijing Coyote Bioscience Co., Ltd. © 2022-2023.