
Why does language not emerge until the second year?

Rhodri Cusack, Conor J Wild, Leire Zubiaurre-Elorza, Annika C Linke

Abstract

From their second year, infants typically begin to show rapid acquisition of receptive and expressive language. Here, we ask why these language skills do not begin to develop earlier. One evolutionary hypothesis is that infants are born when many brain systems, including those critical to language, are immature and not yet functioning: because human infants have a large head and the mother's pelvis size is limited, an early birth is necessitated. An alternative proposal, inspired by discoveries in machine learning, is that the language systems are mature enough to function but need auditory experience to develop effective representations of speech before the language functions that manifest in behaviour can emerge. Growing evidence, in particular from neuroimaging, supports this latter hypothesis. We have previously shown with magnetic resonance imaging (MRI) that the acoustic radiation, carrying rich information to auditory cortex, is largely mature by 1 month, and using functional MRI (fMRI) that auditory cortex is processing many complex features of natural sounds by 3 months. However, speech perception relies upon a network of regions beyond auditory cortex, and it is not established whether this network is mature. Here we measure the maturity of the speech network using functional connectivity with fMRI in infants at 3 months (N = 6) and 9 months (N = 7), and in an adult comparison group (N = 15). We find that functional connectivity in speech networks is mature at 3 months, suggesting that the delay in the onset of language is not due to brain immaturity but rather to the time needed to develop representations through experience. Future avenues for the study of language development are proposed, and the implications for clinical care and infant education are discussed.
Copyright © 2018. Published by Elsevier B.V.


Year:  2018        PMID: 30029804     DOI: 10.1016/j.heares.2018.05.004

Source DB:  PubMed          Journal:  Hear Res        ISSN: 0378-5955            Impact factor:   3.208


  4 in total

1.  Naturalistic Audio-Movies and Narrative Synchronize "Visual" Cortices across Congenitally Blind But Not Sighted Individuals.

Authors:  Rita E Loiotile; Rhodri Cusack; Marina Bedny
Journal:  J Neurosci       Date:  2019-09-23       Impact factor: 6.167

2.  Language level predicts perceptual categorization of complex reversible events in children.

Authors:  Wolfram Hinzen; Elisa Peinado; Scott James Perry; Kristen Schroeder; Mariana Lombardo
Journal:  Heliyon       Date:  2022-07-14

3.  Automatic segmentation of the core of the acoustic radiation in humans.

Authors:  Malin Siegbahn; Cecilia Engmér Berglin; Rodrigo Moreno
Journal:  Front Neurol       Date:  2022-09-23       Impact factor: 4.086

4.  Auditory representation of learned sound sequences in motor regions of the macaque brain.

Authors:  Denis Archakov; Iain DeWitt; Paweł Kuśmierek; Michael Ortiz-Rios; Daniel Cameron; Ding Cui; Elyse L Morin; John W VanMeter; Mikko Sams; Iiro P Jääskeläinen; Josef P Rauschecker
Journal:  Proc Natl Acad Sci U S A       Date:  2020-06-15       Impact factor: 11.205

