Stephani Foraker, Terry Regier, Naveen Khetarpal, Amy Perfors, Joshua Tenenbaum.
Abstract
It is widely held that children's linguistic input underdetermines the correct grammar, and that language learning must therefore be guided by innate linguistic constraints. Here, we show that a Bayesian model can learn a standard poverty-of-stimulus example, anaphoric one, from realistic input by relying on indirect evidence, without a linguistic constraint assumed to be necessary. Our demonstration does, however, assume other linguistic knowledge; thus, we reduce the problem of learning anaphoric one to that of learning this other knowledge. We discuss whether this other knowledge may itself be acquired without linguistic constraints.
Year: 2009 PMID: 21585472 DOI: 10.1111/j.1551-6709.2009.01014.x
Source DB: PubMed Journal: Cogn Sci ISSN: 0364-0213