
Overparameterized neural networks implement associative memory.

Adityanarayanan Radhakrishnan, Mikhail Belkin, Caroline Uhler

Abstract

Identifying computational mechanisms for memorization and retrieval of data is a long-standing problem at the intersection of machine learning and neuroscience. Our main finding is that standard overparameterized deep neural networks trained using standard optimization methods implement such a mechanism for real-valued data. We provide empirical evidence that 1) overparameterized autoencoders store training samples as attractors and thus iterating the learned map leads to sample recovery, and that 2) the same mechanism allows for encoding sequences of examples and serves as an even more efficient mechanism for memory than autoencoding. Theoretically, we prove that when trained on a single example, autoencoders store the example as an attractor. Lastly, by treating a sequence encoder as a composition of maps, we prove that sequence encoding provides a more efficient mechanism for memory than autoencoding.
Copyright © 2020 the Author(s). Published by PNAS.
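The attractor mechanism described in the abstract — train an autoencoder on an example, then recover the example by iterating the learned map from a corrupted input — can be illustrated with a minimal sketch. This is not the authors' code: the one-hidden-layer tanh architecture, the width, the learning rate, and the plain gradient descent loop are all illustrative choices, and whether the stored example actually behaves as an attractor depends on the trained map being contractive around it (the paper proves this for the single-example case under its own conditions).

```python
import numpy as np

rng = np.random.default_rng(0)

d, h = 10, 256                       # input dim, hidden width (overparameterized: h >> d)
x_star = rng.normal(size=d)          # the single training example

# One-hidden-layer autoencoder f(x) = W2 @ tanh(W1 @ x) (illustrative architecture)
W1 = 0.1 * rng.normal(size=(h, d))
W2 = 0.1 * rng.normal(size=(d, h))

lr = 0.01
for _ in range(5000):                # gradient descent on 0.5 * ||f(x*) - x*||^2
    a = np.tanh(W1 @ x_star)
    err = W2 @ a - x_star
    gW2 = np.outer(err, a)
    gW1 = np.outer((W2.T @ err) * (1.0 - a**2), x_star)
    W2 -= lr * gW2
    W1 -= lr * gW1

def f(x):
    """The learned autoencoder map."""
    return W2 @ np.tanh(W1 @ x)

# Start from a corrupted version of the example and iterate the learned map.
x = x_star + 0.5 * rng.normal(size=d)
for _ in range(100):
    x = f(x)

# Distance to the stored example; small when x* is stored as an attractor.
print(np.linalg.norm(x - x_star))
```

After training, `f(x_star) ≈ x_star` by construction; the interesting claim in the paper is the stronger one, that iterating `f` from nearby corrupted inputs flows back to `x_star`.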


Keywords:  associative memory; autoencoders; neural networks; overparameterization; sequence encoders


Year:  2020        PMID: 33067397     DOI: 10.1073/pnas.2005013117

Source DB:  PubMed          Journal:  Proc Natl Acad Sci U S A        ISSN: 0027-8424            Impact factor:   11.205


Related records: 3 in total

1.  Associative Memories via Predictive Coding.

Authors:  Tommaso Salvatori; Yuhang Song; Yujian Hong; Lei Sha; Simon Frieder; Zhenghua Xu; Rafal Bogacz; Thomas Lukasiewicz
Journal:  Adv Neural Inf Process Syst       Date:  2021-12-01

2.  Causal network models of SARS-CoV-2 expression and aging to identify candidates for drug repurposing.

Authors:  Anastasiya Belyaeva; Louis Cammarata; Adityanarayanan Radhakrishnan; Chandler Squires; Karren Dai Yang; G V Shivashankar; Caroline Uhler
Journal:  Nat Commun       Date:  2021-02-15       Impact factor: 14.919

3.  Simple, fast, and flexible framework for matrix completion with infinite width neural networks.

Authors:  Adityanarayanan Radhakrishnan; George Stefanakis; Mikhail Belkin; Caroline Uhler
Journal:  Proc Natl Acad Sci U S A       Date:  2022-04-11       Impact factor: 12.779

