
Self-Net: Lifelong Learning via Continual Self-Modeling.

Jaya Krishna Mandivarapu, Blake Camp, Rolando Estrada.

Abstract

Learning a set of tasks over time, also known as continual learning (CL), is one of the most challenging problems in artificial intelligence. While recent approaches achieve some degree of CL in deep neural networks, they either (1) store a new network (or an equivalent number of parameters) for each new task, (2) store training data from previous tasks, or (3) restrict the network's ability to learn new tasks. To address these issues, we propose a novel framework, Self-Net, that uses an autoencoder to learn a set of low-dimensional representations of the weights learned for different tasks. We demonstrate that these low-dimensional vectors can then be used to generate high-fidelity recollections of the original weights. Self-Net can incorporate new tasks over time with little retraining, minimal loss in performance for older tasks, and without storing prior training data. We show that our technique achieves over 10X storage compression in a continual fashion, and that it outperforms state-of-the-art approaches on numerous datasets, including continual versions of MNIST, CIFAR10, CIFAR100, Atari, and task-incremental CORe50. To the best of our knowledge, we are the first to use autoencoders to sequentially encode sets of network weights to enable continual learning.
Copyright © 2020 Mandivarapu, Camp and Estrada.
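
A minimal PyTorch sketch of the mechanism the abstract describes: a small autoencoder is trained on the flattened weight vectors of per-task networks, and only the low-dimensional codes are stored, from which approximate weights can later be regenerated. All dimensions, names (`WeightAutoencoder`, `WEIGHT_DIM`, `LATENT_DIM`), and the training loop below are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn

WEIGHT_DIM = 4000   # length of a flattened task-network weight vector (assumed)
LATENT_DIM = 32     # size of the per-task code (assumed)

class WeightAutoencoder(nn.Module):
    """Autoencoder over weight vectors, not over data samples."""
    def __init__(self, weight_dim: int, latent_dim: int):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(weight_dim, 256), nn.ReLU(),
            nn.Linear(256, latent_dim),
        )
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, 256), nn.ReLU(),
            nn.Linear(256, weight_dim),
        )

    def forward(self, w: torch.Tensor) -> torch.Tensor:
        return self.decoder(self.encoder(w))

# Stand-ins for the flattened parameters of three already-trained
# task networks; in practice these come from the per-task learners.
task_weights = [torch.randn(WEIGHT_DIM) for _ in range(3)]

ae = WeightAutoencoder(WEIGHT_DIM, LATENT_DIM)
opt = torch.optim.Adam(ae.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

# Fit the autoencoder to reconstruct every stored weight vector, so
# each task's code yields a high-fidelity recollection of its weights.
for step in range(1000):
    batch = torch.stack(task_weights)
    loss = loss_fn(ae(batch), batch)
    opt.zero_grad()
    loss.backward()
    opt.step()

# Keep only the low-dimensional codes; regenerate weights on demand.
with torch.no_grad():
    codes = [ae.encoder(w) for w in task_weights]
    recalled = ae.decoder(codes[0])  # approximate weights for task 0
```

In the setting the abstract describes, the autoencoder is retrained as each new task's weights arrive, so the stored codes continue to reconstruct older weights without retaining the original networks or any prior training data; this is what yields the reported storage compression.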


Keywords:  autoencoders; catastrophic forgetting; continual learning; deep learning; manifold learning

Year:  2020        PMID: 33733138      PMCID: PMC7861283          DOI: 10.3389/frai.2020.00019

Source DB:  PubMed          Journal:  Front Artif Intell        ISSN: 2624-8212


References (9 in total)

Review 1.  Hippocampal replay in the awake state: a potential substrate for memory consolidation and retrieval.

Authors:  Margaret F Carr; Shantanu P Jadhav; Loren M Frank
Journal:  Nat Neurosci       Date:  2011-02       Impact factor: 24.884

Review 2.  Interplay of hippocampus and prefrontal cortex in memory.

Authors:  Alison R Preston; Howard Eichenbaum
Journal:  Curr Biol       Date:  2013-09-09       Impact factor: 10.834

Review 3.  Continual lifelong learning with neural networks: A review.

Authors:  German I Parisi; Ronald Kemker; Jose L Part; Christopher Kanan; Stefan Wermter
Journal:  Neural Netw       Date:  2019-02-06

4.  Overcoming catastrophic forgetting in neural networks.

Authors:  James Kirkpatrick; Razvan Pascanu; Neil Rabinowitz; Joel Veness; Guillaume Desjardins; Andrei A Rusu; Kieran Milan; John Quan; Tiago Ramalho; Agnieszka Grabska-Barwinska; Demis Hassabis; Claudia Clopath; Dharshan Kumaran; Raia Hadsell
Journal:  Proc Natl Acad Sci U S A       Date:  2017-03-14       Impact factor: 11.205

5.  Alleviating catastrophic forgetting using context-dependent gating and synaptic stabilization.

Authors:  Nicolas Y Masse; Gregory D Grant; David J Freedman
Journal:  Proc Natl Acad Sci U S A       Date:  2018-10-12       Impact factor: 11.205

6.  The hippocampal memory indexing theory.

Authors:  T J Teyler; P DiScenna
Journal:  Behav Neurosci       Date:  1986-04       Impact factor: 1.912

7.  Note on the quadratic penalties in elastic weight consolidation.

Authors:  Ferenc Huszár
Journal:  Proc Natl Acad Sci U S A       Date:  2018-02-20       Impact factor: 11.205

8.  Lifelong Learning of Spatiotemporal Representations With Dual-Memory Recurrent Self-Organization.

Authors:  German I Parisi; Jun Tani; Cornelius Weber; Stefan Wermter
Journal:  Front Neurorobot       Date:  2018-11-28       Impact factor: 2.650

Review 9.  What Learning Systems do Intelligent Agents Need? Complementary Learning Systems Theory Updated.

Authors:  Dharshan Kumaran; Demis Hassabis; James L McClelland
Journal:  Trends Cogn Sci       Date:  2016-07       Impact factor: 20.229

Cited by (1 in total)

1.  Learning over a lifetime.

Authors:  Neil Savage
Journal:  Nature       Date:  2022-07-20       Impact factor: 69.504

