Oscar C González, Yury Sokolov, Giri P Krishnan, Jean Erik Delanois, Maxim Bazhenov.
Abstract
Continual learning remains an unsolved problem in artificial neural networks. The brain has evolved mechanisms to prevent catastrophic forgetting of old knowledge during new training. Building upon data suggesting the importance of sleep in learning and memory, we tested a hypothesis that sleep protects old memories from being forgotten after new learning. In the thalamocortical model, training a new memory interfered with previously learned old memories, leading to degradation and forgetting of the old memory traces. Simulating sleep after new learning reversed the damage and enhanced old and new memories. We found that when a new memory competed for previously allocated neuronal/synaptic resources, sleep replay changed the synaptic footprint of the old memory to allow overlapping neuronal populations to store multiple memories. Our study predicts that memory storage is dynamic, and sleep enables continual learning by combining consolidation of new memory traces with reconsolidation of old memory traces to minimize interference.
Entities:
Keywords: catastrophic forgetting; continual learning; memory consolidation; neural network; neuroscience; sleep
Mesh:
Year: 2020 PMID: 32748786 PMCID: PMC7440920 DOI: 10.7554/eLife.51005
Source DB: PubMed Journal: eLife ISSN: 2050-084X Impact factor: 8.140
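
Note: the sketch below is an illustrative analogy only, not the paper's thalamocortical spiking model. It uses a toy linear associator, with hypothetical sizes, learning rate, and helper names, to show the phenomenon the abstract describes: training a new memory alone degrades an old memory stored in the same synapses, while interleaving replay of the old memory during new training (a crude stand-in for sleep replay) protects both.

# Illustrative sketch (assumptions labeled): toy linear associator, NOT the
# paper's thalamocortical spiking model. Demonstrates catastrophic forgetting
# when a new memory is trained alone, and protection when old-memory replay
# (a crude stand-in for sleep replay) is interleaved with new training.
import numpy as np

rng = np.random.default_rng(0)

N_IN, N_OUT = 50, 10      # hypothetical network size
LR, EPOCHS = 0.05, 300    # hypothetical learning rate / training length

def make_memory(n_pairs=10):
    # Random input->target associations; both memories use the same input
    # units, so they compete for overlapping synaptic resources.
    return rng.random((n_pairs, N_IN)), rng.random((n_pairs, N_OUT))

def train(W, x, y, replay=None):
    # Plain gradient descent on squared recall error; optionally interleave
    # one replay step on an old memory after each step on the new one.
    for _ in range(EPOCHS):
        W -= LR * x.T @ (x @ W - y) / len(x)
        if replay is not None:
            xr, yr = replay
            W -= LR * xr.T @ (xr @ W - yr) / len(xr)
    return W

def recall_error(W, x, y):
    return float(np.mean((x @ W - y) ** 2))

mem_a, mem_b = make_memory(), make_memory()

# Learn memory A first.
W = train(np.zeros((N_IN, N_OUT)), *mem_a)

# New learning without replay: memory A degrades (catastrophic forgetting).
W_forget = train(W.copy(), *mem_b)

# New learning with interleaved replay of A: both memories are retained.
W_replay = train(W.copy(), *mem_b, replay=mem_a)

print("error on A after learning A:            ", recall_error(W, *mem_a))
print("error on A after B, no replay:          ", recall_error(W_forget, *mem_a))
print("error on A after B, with replay:        ", recall_error(W_replay, *mem_a))
print("error on B after B, with replay:        ", recall_error(W_replay, *mem_b))

In this reduced setting, the no-replay run should show a clear rise in memory-A error after new training, while the replay run should keep it near its post-training level, loosely mirroring the protective effect of sleep replay described in the abstract.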