
Free recall scaling laws and short-term memory effects in a latching attractor network.

Vezha Boboeva, Alberto Pezzotta, Claudia Clopath.

Abstract

Despite the complexity of human memory, paradigms like free recall have revealed robust qualitative and quantitative characteristics, such as power laws governing recall capacity. Although abstract random matrix models could explain such laws, the possibility of their implementation in large networks of interacting neurons has so far remained underexplored. We study an attractor network model of long-term memory endowed with firing rate adaptation and global inhibition. Under appropriate conditions, the network's transitions from memory to memory are constrained by limit cycles that prevent it from recalling all memories, with scaling similar to what has been found in experiments. When the model is supplemented with a heteroassociative learning rule, complementing the standard autoassociative learning rule, as well as short-term synaptic facilitation, it reproduces other key findings in the free recall literature, namely serial position effects, contiguity and forward asymmetry effects, and the semantic effects found to guide memory recall. The model is consistent with a broad series of manipulations aimed at gaining a better understanding of the variables that affect recall, such as the role of rehearsal, presentation rates, and continuous and/or end-of-list distractor conditions. We predict that recall capacity may be increased with the addition of small amounts of noise, for example, in the form of weak random stimuli during recall. Finally, we predict that, although the statistics of the encoded memories have a strong effect on the recall capacity, the power laws governing recall capacity may still be expected to hold.
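The limit-cycle mechanism the abstract describes can be illustrated with an abstract similarity-matrix model in the spirit of reference 2 below (Romani et al., "Scaling laws of associative memory retrieval"): recall hops deterministically from each item to its most similar neighbor, the walk soon closes into a cycle, and most items are never recalled. This is a minimal sketch under assumed parameters (Gaussian similarities, a greedy transition rule, illustrative list sizes), not the paper's attractor network:

```python
import numpy as np

def recall_capacity(P, rng):
    """Deterministic similarity-driven recall walk over P items.

    From the current item, recall jumps to the most similar other item,
    excluding the immediately preceding one. The walk is a deterministic
    function of the state (prev, cur), so it must eventually enter a limit
    cycle; the number of distinct items visited before the cycle closes is
    the recall capacity.
    """
    S = rng.standard_normal((P, P))
    S = (S + S.T) / 2.0              # symmetric random similarity matrix
    np.fill_diagonal(S, -np.inf)     # an item is never its own successor
    prev, cur = -1, 0                # start recall at item 0
    seen, states = set(), set()
    while (prev, cur) not in states: # revisiting a state closes the cycle
        states.add((prev, cur))
        seen.add(cur)
        sims = S[cur].copy()
        if prev >= 0:
            sims[prev] = -np.inf     # forbid an immediate back-step
        prev, cur = cur, int(np.argmax(sims))
    return len(seen)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    for P in (32, 128, 512):
        mean_R = np.mean([recall_capacity(P, rng) for _ in range(200)])
        print(P, round(mean_R, 1))   # capacity grows sublinearly in P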

Keywords:  attractor network; free recall; latching dynamics; recall capacity

Year:  2021        PMID: 34873052      PMCID: PMC8670499          DOI: 10.1073/pnas.2026092118

Source DB:  PubMed          Journal:  Proc Natl Acad Sci U S A        ISSN: 0027-8424            Impact factor:   12.779


References (43 in total; first 10 listed)

1.  Temporal association in asymmetric neural networks.

Authors: 
Journal:  Phys Rev Lett       Date:  1986-12-01       Impact factor: 9.161

2.  Scaling laws of associative memory retrieval.

Authors:  Sandro Romani; Itai Pinkoviezky; Alon Rubin; Misha Tsodyks
Journal:  Neural Comput       Date:  2013-06-18       Impact factor: 2.026

3.  Learning 10,000 pictures.

Authors:  L Standing
Journal:  Q J Exp Psychol       Date:  1973-05       Impact factor: 2.143

4.  A neural model of the dynamic activation of memory.

Authors:  M Herrmann; E Ruppin; M Usher
Journal:  Biol Cybern       Date:  1993       Impact factor: 2.086

5.  Memory Retrieval from First Principles. (Review)

Authors:  M Katkov; S Romani; M Tsodyks
Journal:  Neuron       Date:  2017-06-07       Impact factor: 17.173

6.  Is memory search governed by universal principles or idiosyncratic strategies?

Authors:  M Karl Healey; Michael J Kahana
Journal:  J Exp Psychol Gen       Date:  2013-08-19

7.  Reactivation in working memory: an attractor network model of free recall.

Authors:  Anders Lansner; Petter Marklund; Sverker Sikström; Lars-Göran Nilsson
Journal:  PLoS One       Date:  2013-08-30       Impact factor: 3.240

8.  Professional or Amateur? The Phonological Output Buffer as a Working Memory Operator.

Authors:  Neta Haluts; Massimiliano Trippa; Naama Friedmann; Alessandro Treves
Journal:  Entropy (Basel)       Date:  2020-06-15       Impact factor: 2.524

9.  Storage of correlated patterns in standard and bistable Purkinje cell models.

Authors:  Claudia Clopath; Jean-Pierre Nadal; Nicolas Brunel
Journal:  PLoS Comput Biol       Date:  2012-04-26       Impact factor: 4.475

10.  Neural Network Model of Memory Retrieval.

Authors:  Stefano Recanatesi; Mikhail Katkov; Sandro Romani; Misha Tsodyks
Journal:  Front Comput Neurosci       Date:  2015-12-17       Impact factor: 2.380


Beijing Coyote Bioscience Co., Ltd. (北京卡尤迪生物科技股份有限公司) © 2022-2023.