| Literature DB >> 10860707 |
Abstract
This paper places models of language evolution within the framework of information theory. We study how signals become associated with meaning. If there is a probability of mistaking signals for each other, then evolution leads to an error limit: increasing the number of signals does not increase the fitness of a language beyond a certain limit. This error limit can be overcome by word formation: a linear increase of the word length leads to an exponential increase of the maximum fitness. We develop a general model of word formation and demonstrate the connection between the error limit and Shannon's noisy coding theorem. Copyright 2000 Academic Press.
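The abstract's claim that a linear increase in word length yields an exponential increase in available signals can be illustrated with a toy count. This is a minimal sketch, not the paper's model: the phoneme-alphabet size of 4 and the function name are illustrative assumptions.

```python
def num_words(alphabet_size: int, length: int) -> int:
    """Count distinct words of a given length over a fixed phoneme alphabet."""
    return alphabet_size ** length

# Linear growth in word length -> exponential growth in the signal repertoire,
# which is what lets word formation escape the error limit on single signals.
for length in range(1, 6):
    print(length, num_words(4, length))
```

With 4 phonemes, words of length 1 through 5 give 4, 16, 64, 256, and 1024 distinct forms, so the repertoire (and hence the fitness ceiling) scales exponentially while word length scales linearly.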
Year: 2000 PMID: 10860707 DOI: 10.1006/jtbi.2000.2053
Source DB: PubMed Journal: J Theor Biol ISSN: 0022-5193 Impact factor: 2.691