
Cognitive science in the era of artificial intelligence: A roadmap for reverse-engineering the infant language-learner.

Emmanuel Dupoux

Abstract

Spectacular progress in the information processing sciences (machine learning, wearable sensors) promises to revolutionize the study of cognitive development. Here, we analyse the conditions under which 'reverse engineering' language development, i.e., building an effective system that mimics infants' achievements, can contribute to our scientific understanding of early language development. We argue that, on the computational side, it is important to move from toy problems to the full complexity of the learning situation, taking as input reconstructions of the sensory signals available to infants that are as faithful as possible. On the data side, accessible but privacy-preserving repositories of home data have to be set up. On the psycholinguistic side, specific tests have to be constructed to benchmark humans and machines at different linguistic levels. We discuss the feasibility of this approach and present an overview of current results.
Copyright © 2017 Elsevier B.V. All rights reserved.

Keywords:  Artificial intelligence; Computational modeling; Corpus analysis; Early language acquisition; Infant development; Language bootstrapping; Machine learning; Psycholinguistics; Speech

Year:  2018        PMID: 29324240     DOI: 10.1016/j.cognition.2017.11.008

Source DB:  PubMed          Journal:  Cognition        ISSN: 0010-0277


  8 in total

1.  Early phonetic learning without phonetic categories: Insights from large-scale simulations on realistic input.

Authors:  Thomas Schatz; Naomi H Feldman; Sharon Goldwater; Xuan-Nga Cao; Emmanuel Dupoux
Journal:  Proc Natl Acad Sci U S A       Date:  2021-02-09       Impact factor: 11.205

2.  Learning Through Processing: Toward an Integrated Approach to Early Word Learning.

Authors:  Stephan C Meylan; Elika Bergelson
Journal:  Annu Rev Linguist       Date:  2021-10-05

3.  Oh, Behave!: PRESIDENTIAL ADDRESS, XXth International Conference on Infant Studies New Orleans, LA, US May 2016.

Authors:  Karen E Adolph
Journal:  Infancy       Date:  2020-06-18

4.  Generative Adversarial Phonology: Modeling Unsupervised Phonetic and Phonological Learning With Neural Networks.

Authors:  Gašper Beguš
Journal:  Front Artif Intell       Date:  2020-07-08

5.  Brain-inspired model for early vocal learning and correspondence matching using free-energy optimization.

Authors:  Alexandre Pitti; Mathias Quoy; Sofiane Boucenna; Catherine Lavandier
Journal:  PLoS Comput Biol       Date:  2021-02-18       Impact factor: 4.475

6.  Inferring the nature of linguistic computations in the brain.

Authors:  Sanne Ten Oever; Karthikeya Kaushik; Andrea E Martin
Journal:  PLoS Comput Biol       Date:  2022-07-28       Impact factor: 4.779

7.  Synthesizing theories of human language with Bayesian program induction.

Authors:  Kevin Ellis; Adam Albright; Armando Solar-Lezama; Joshua B Tenenbaum; Timothy J O'Donnell
Journal:  Nat Commun       Date:  2022-08-30       Impact factor: 17.694

8.  Unsupervised Few-Shot Feature Learning via Self-Supervised Training.

Authors:  Zilong Ji; Xiaolong Zou; Tiejun Huang; Si Wu
Journal:  Front Comput Neurosci       Date:  2020-10-14       Impact factor: 2.380

