| Literature DB >> 26427059 |
Christoph Salge, Nihat Ay, Daniel Polani, Mikhail Prokopenko.
Abstract
We propose a model that explains the reliable emergence of power laws (e.g., Zipf's law) during the development of different human languages. The model incorporates the principle of least effort in communication, minimizing a combination of information-theoretic communication inefficiency and direct signal cost. We prove a general relationship, holding for all optimal languages, between the signal cost distribution and the resulting distribution of signals. Zipf's law then emerges for logarithmic signal cost distributions, which is the cost distribution expected for words constructed from letters or phonemes.
Year: 2015 PMID: 26427059 PMCID: PMC4591018 DOI: 10.1371/journal.pone.0139475
Source DB: PubMed Journal: PLoS One ISSN: 1932-6203 Impact factor: 3.240
Fig 1. A log-plot of the 1000 cheapest words created from a 10-letter alphabet, ordered by their cost rank. Word cost is the sum of the individual letter costs, and each letter cost lies between 1.0 and 2.0 units.
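The construction behind Fig 1 can be sketched in a few lines. This is a hypothetical reconstruction, not the authors' code: it assumes letter costs are drawn uniformly from [1.0, 2.0] (the paper's figure only states the range), enumerates candidate words up to length 5, and takes the 1000 cheapest by total cost.

```python
import itertools
import random

# Assumption: 10 letters with costs drawn uniformly from [1.0, 2.0];
# the figure only specifies the range, not the exact costs.
random.seed(0)
ALPHABET = "abcdefghij"
letter_cost = {c: random.uniform(1.0, 2.0) for c in ALPHABET}

def word_cost(word):
    """Word cost is the sum of its letters' costs."""
    return sum(letter_cost[c] for c in word)

# Enumerate all words of length 1..5. Words of length >= 6 cost at
# least 6 units, which in practice exceeds the 1000th-cheapest cost.
words = (
    "".join(t)
    for n in range(1, 6)
    for t in itertools.product(ALPHABET, repeat=n)
)
cheapest = sorted(words, key=word_cost)[:1000]
costs = [word_cost(w) for w in cheapest]
```

Plotting `costs` against the logarithm of the cost rank reproduces the roughly logarithmic growth shown in the figure, which is the cost profile under which the paper's result yields Zipf's law.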