
Quantifying the information in the long-range order of words: semantic structures and universal linguistic constraints.

Marcelo A Montemurro

Abstract

We review some recent progress on the characterisation of long-range patterns of word use in language using methods from information theory. In particular, two levels of structure in language are considered. The first level corresponds to the patterns of word usage over different contextual domains. A direct application of information theory to quantify the specificity of words across different sections of a linguistic sequence leads to a measure of semantic information. Moreover, a natural scale emerges that characterises the typical size of semantic structures. Since the information measure is made up of additive contributions from individual words, it is possible to rank the words according to their overall weight in the total information. This allows the extraction of the keywords most relevant to the semantic content of the sequence without any prior knowledge of the language. The second level considered is the complex structure of correlations among words in linguistic sequences. The degree of order in language can be quantified by means of the entropy. Reliable estimates of the entropy were obtained from corpora of texts from several linguistic families by means of lossless compression algorithms. The value of the entropy fluctuates across different languages since it depends on linguistic organisation at various levels. However, when a measure of relative entropy that specifically quantifies the degree of word ordering in language is estimated, it remains almost constant across all the linguistic families studied. This suggests that the entropy of word ordering is a novel quantitative linguistic universal.
Copyright © 2013 Elsevier Ltd. All rights reserved.
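The two measures summarised in the abstract can be illustrated with a minimal sketch. This is not the paper's exact estimators: the study used corpus-scale lossless compression and a specific section-entropy formalism, whereas here `zlib` stands in for the compressor and `keyword_scores` is a simplified specificity measure (all function names and the toy texts are illustrative assumptions).

```python
import math
import random
import zlib
from collections import Counter

def keyword_scores(words, num_parts=4):
    """Score each word by how unevenly it is spread over equal-sized
    sections of the text: words concentrated in few sections score high,
    uniformly used words score near zero (simplified specificity measure)."""
    part_len = max(1, len(words) // num_parts)
    parts = [Counter(words[i:i + part_len])
             for i in range(0, len(words), part_len)]
    totals = Counter(words)
    scores = {}
    for w, n in totals.items():
        probs = [c[w] / n for c in parts if c[w]]
        h = -sum(p * math.log2(p) for p in probs)  # entropy over sections
        scores[w] = (n / len(words)) * (math.log2(len(parts)) - h)
    return scores

def bits_per_word(words):
    """Upper-bound entropy estimate (bits/word): compressed size of the
    space-joined sequence divided by the number of words."""
    data = " ".join(words).encode("utf-8")
    return 8 * len(zlib.compress(data, 9)) / len(words)

def word_order_information(words, seed=0):
    """Relative entropy of word ordering: the extra bits/word needed once
    the order is destroyed by shuffling while word frequencies are kept."""
    shuffled = words[:]
    random.Random(seed).shuffle(shuffled)
    return bits_per_word(shuffled) - bits_per_word(words)
```

For any text with real sequential structure, `word_order_information` is positive, since the shuffled version compresses worse than the original; the paper's finding is that this quantity is nearly the same across linguistic families.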

Keywords:  Entropy; Information theory; Language; Linguistic universal; Semantic information

Year:  2013        PMID: 24074456     DOI: 10.1016/j.cortex.2013.08.008

Source DB:  PubMed          Journal:  Cortex        ISSN: 0010-9452            Impact factor:   4.027


  4 in total

1.  Using information-theoretic measures to characterize the structure of the writing system: the case of orthographic-phonological regularities in English.

Authors:  Noam Siegelman; Devin M Kearns; Jay G Rueckl
Journal:  Behav Res Methods       Date:  2020-06

2.  Long-Range Memory in Literary Texts: On the Universal Clustering of the Rare Words.

Authors:  Kumiko Tanaka-Ishii; Armin Bunde
Journal:  PLoS One       Date:  2016-11-28       Impact factor: 3.240

3.  Do neural nets learn statistical laws behind natural language?

Authors:  Shuntaro Takahashi; Kumiko Tanaka-Ishii
Journal:  PLoS One       Date:  2017-12-29       Impact factor: 3.240

4.  Evaluation of Error Production in Animal Fluency and Its Relationship to Frontal Tracts in Normal Aging and Mild Alzheimer's Disease: A Combined LDA and Time-Course Analysis Investigation.

Authors:  Yoshihiro Itaguchi; Susana A Castro-Chavira; Knut Waterloo; Stein Harald Johnsen; Claudia Rodríguez-Aranda
Journal:  Front Aging Neurosci       Date:  2022-01-12       Impact factor: 5.750

