
Beyond input: Language learners produce novel relative clause types without exposure.

Adam M Morgan, Victor S Ferreira.

Abstract

Syntax famously consists of abstract hierarchical representations, essentially instructions for combining words into larger units like sentences. Less famously, most theories of syntax also assume a higher level of abstract representation. Representations at this level comprise instructions for creating the hierarchical representations used to create sentences. To date, however, there is no experimental evidence for this additional level of abstraction. Here, we explain why the existence of such representations would imply that, under certain circumstances, speakers should be able to produce structures they have never been exposed to, and we test this prediction directly. We ask: given the right type of input, can speakers learn a syntactic structure without direct exposure? In particular, different types of relative clauses have different surface word orders. These may be represented in two ways: with many individual representations or with one general representation. If the latter, then learning one type of relative clause amounts to learning all types. We taught participants a novel grammar for only some relative clause types (e.g., just subject relative clauses) and tested their knowledge of other types (e.g., object relative clauses). Across experiments, participants consistently produced untrained types, implicating the existence of this higher level of abstract syntactic knowledge.

Keywords:  artificial language learning; generalization; relative clauses; syntax

Year:  2021        PMID: 34484658      PMCID: PMC8412168          DOI: 10.1080/20445911.2021.1928678

Source DB:  PubMed          Journal:  J Cogn Psychol (Hove)        ISSN: 2044-5911


References (35 in total)

1.  The item-based nature of children's early syntactic development.

Authors:  Michael Tomasello
Journal:  Trends Cogn Sci       Date:  2000-04       Impact factor: 20.229

2.  Processing relative clauses in Chinese.

Authors:  Franny Hsiao; Edward Gibson
Journal:  Cognition       Date:  2003-11

3.  Learning biases predict a word order universal.

Authors:  Jennifer Culbertson; Paul Smolensky; Géraldine Legendre
Journal:  Cognition       Date:  2011-12-28

4.  The influence of contextual diversity on word learning.

Authors:  Brendan T Johns; Melody Dye; Michael N Jones
Journal:  Psychon Bull Rev       Date:  2016-08

5.  Evolved structure of language shows lineage-specific trends in word-order universals.

Authors:  Michael Dunn; Simon J Greenhill; Stephen C Levinson; Russell D Gray
Journal:  Nature       Date:  2011-04-13       Impact factor: 49.962

6.  Linking production and comprehension processes: the case of relative clauses.

Authors:  Silvia P Gennari; Maryellen C MacDonald
Journal:  Cognition       Date:  2009-02-11

7.  (Review) Linguistic complexity: locality of syntactic dependencies.

Authors:  E Gibson
Journal:  Cognition       Date:  1998-08

8.  Random effects structure for confirmatory hypothesis testing: Keep it maximal.

Authors:  Dale J Barr; Roger Levy; Christoph Scheepers; Harry J Tily
Journal:  J Mem Lang       Date:  2013-04       Impact factor: 3.059

9.  A Bayesian model of biases in artificial language learning: the case of a word-order universal.

Authors:  Jennifer Culbertson; Paul Smolensky
Journal:  Cogn Sci       Date:  2012-09-10

10.  Subject/object processing asymmetries in Korean relative clauses: Evidence from ERP data.

Authors:  Nayoung Kwon; Robert Kluender; Marta Kutas; Maria Polinsky
Journal:  Language (Baltim)       Date:  2013-09
