Associative Learning Should Go Deep.

Esther Mondragón, Eduardo Alonso, Niklas Kokkola.

Abstract

Conditioning, the process by which animals learn to associate two or more events, is one of the most influential paradigms in learning theory. It is nevertheless unclear how current models of associative learning can accommodate complex phenomena without ad hoc representational assumptions. We propose embracing deep neural networks to address this problem.
Copyright © 2017 Elsevier Ltd. All rights reserved.

Keywords:  associative learning; deep neural networks

Year:  2017        PMID: 28668210     DOI: 10.1016/j.tics.2017.06.001

Source DB:  PubMed          Journal:  Trends Cogn Sci        ISSN: 1364-6613            Impact factor:   20.229
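
Note: the abstract argues that current models of associative learning need ad hoc representational assumptions to accommodate complex phenomena. As a point of reference for what such a classical model looks like, below is a minimal Python sketch of the Rescorla-Wagner error-correction rule, the canonical trial-level model of conditioning. The sketch, the function name, and the parameter values are illustrative assumptions; the article itself provides no implementation.

    # Minimal Rescorla-Wagner simulation of Pavlovian acquisition and extinction.
    # Illustrative only: all parameter values are assumptions for demonstration.

    ALPHA = 0.3   # salience of the conditioned stimulus (CS)
    BETA = 1.0    # learning rate associated with the unconditioned stimulus (US)

    def rescorla_wagner(trials, v=0.0):
        """Update associative strength v across a sequence of trials.

        Each trial supplies lambda, the asymptote of conditioning the US
        supports (e.g., 1.0 when the US is present, 0.0 when it is omitted).
        """
        history = []
        for lam in trials:
            v += ALPHA * BETA * (lam - v)  # delta-V = alpha * beta * (lambda - V)
            history.append(v)
        return history

    # 10 acquisition trials (CS paired with US), then 10 extinction trials (CS alone).
    strengths = rescorla_wagner([1.0] * 10 + [0.0] * 10)
    print([round(v, 3) for v in strengths])

Running the sketch shows associative strength rising toward the asymptote during acquisition and decaying during extinction; phenomena beyond this single-cue case are where, per the abstract, such models require additional representational assumptions.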


  5 in total

1.  The Successor Representation: Its Computational Logic and Neural Substrates.

Authors:  Samuel J Gershman
Journal:  J Neurosci       Date:  2018-07-13       Impact factor: 6.167

2.  Hallucinations and Strong Priors. (Review)

Authors:  Philip R Corlett; Guillermo Horga; Paul C Fletcher; Ben Alderson-Day; Katharina Schmack; Albert R Powers
Journal:  Trends Cogn Sci       Date:  2018-12-21       Impact factor: 20.229

3.  Believing in dopamine. (Review)

Authors:  Samuel J Gershman; Naoshige Uchida
Journal:  Nat Rev Neurosci       Date:  2019-09-30       Impact factor: 34.870

4.  Dissociating Representations of Time and Number in Reinforcement-Rate Learning by Deletion of the GluA1 AMPA Receptor Subunit in Mice.

Authors:  Joseph M Austen; Corran Pickering; Rolf Sprengel; David J Sanderson
Journal:  Psychol Sci       Date:  2021-01-04

5.  Evaluating the progress of deep learning for visual relational concepts.

Authors:  Sebastian Stabinger; David Peer; Justus Piater; Antonio Rodríguez-Sánchez
Journal:  J Vis       Date:  2021-10-05       Impact factor: 2.240

