
What Learning Systems do Intelligent Agents Need? Complementary Learning Systems Theory Updated.

Dharshan Kumaran, Demis Hassabis, James L. McClelland

Abstract

We update complementary learning systems (CLS) theory, which holds that intelligent agents must possess two learning systems, instantiated in mammals in the neocortex and hippocampus. The first gradually acquires structured knowledge representations while the second quickly learns the specifics of individual experiences. We broaden the role of replay of hippocampal memories in the theory, noting that replay allows goal-dependent weighting of experience statistics. We also address recent challenges to the theory and extend it by showing that recurrent activation of hippocampal traces can support some forms of generalization and that neocortical learning can be rapid for information that is consistent with known structure. Finally, we note the relevance of the theory to the design of artificial intelligent agents, highlighting connections between neuroscience and machine learning.
Copyright © 2016 Elsevier Ltd. All rights reserved.

Keywords:  artificial intelligence; hippocampus; learning; memory

Year:  2016        PMID: 27315762     DOI: 10.1016/j.tics.2016.05.004

Source DB:  PubMed          Journal:  Trends Cogn Sci        ISSN: 1364-6613            Impact factor:   20.229


Related articles:  76 in total

1.  mPFC spindle cycles organize sparse thalamic activation and recently active CA1 cells during non-REM sleep.

Authors:  Carmen Varela; Matthew A Wilson
Journal:  Elife       Date:  2020-06-11       Impact factor: 8.140

Review 2.  Mechanisms of systems memory consolidation during sleep.

Authors:  Jens G Klinzing; Niels Niethard; Jan Born
Journal:  Nat Neurosci       Date:  2019-08-26       Impact factor: 24.884

3.  Overcoming catastrophic forgetting in neural networks.

Authors:  James Kirkpatrick; Razvan Pascanu; Neil Rabinowitz; Joel Veness; Guillaume Desjardins; Andrei A Rusu; Kieran Milan; John Quan; Tiago Ramalho; Agnieszka Grabska-Barwinska; Demis Hassabis; Claudia Clopath; Dharshan Kumaran; Raia Hadsell
Journal:  Proc Natl Acad Sci U S A       Date:  2017-03-14       Impact factor: 11.205

Review 4.  Direct Fit to Nature: An Evolutionary Perspective on Biological and Artificial Neural Networks.

Authors:  Uri Hasson; Samuel A Nastase; Ariel Goldstein
Journal:  Neuron       Date:  2020-02-05       Impact factor: 17.173

5.  Toward an Integration of Deep Learning and Neuroscience.

Authors:  Adam H Marblestone; Greg Wayne; Konrad P Kording
Journal:  Front Comput Neurosci       Date:  2016-09-14       Impact factor: 2.380

6.  A Non-parametric Approach to the Overall Estimate of Cognitive Load Using NIRS Time Series.

Authors:  Soheil Keshmiri; Hidenobu Sumioka; Ryuji Yamazaki; Hiroshi Ishiguro
Journal:  Front Hum Neurosci       Date:  2017-02-03       Impact factor: 3.169

7.  Integration of new information in memory: new insights from a complementary learning systems perspective.

Authors:  James L McClelland; Bruce L McNaughton; Andrew K Lampinen
Journal:  Philos Trans R Soc Lond B Biol Sci       Date:  2020-04-06       Impact factor: 6.237

Review 8.  Mechanisms of neural organization and rhythmogenesis during hippocampal and cortical ripples.

Authors:  Sam McKenzie; Noam Nitzan; Daniel F English
Journal:  Philos Trans R Soc Lond B Biol Sci       Date:  2020-04-06       Impact factor: 6.237

Review 9.  If deep learning is the answer, what is the question?

Authors:  Andrew Saxe; Stephanie Nelli; Christopher Summerfield
Journal:  Nat Rev Neurosci       Date:  2020-11-16       Impact factor: 34.870

Review 10.  Abstraction and generalization in statistical learning: implications for the relationship between semantic types and episodic tokens.

Authors:  Gerry T M Altmann
Journal:  Philos Trans R Soc Lond B Biol Sci       Date:  2017-01-05       Impact factor: 6.237

