Exploring What Is Encoded in Distributional Word Vectors: A Neurobiologically Motivated Analysis.

Akira Utsumi

Abstract

The pervasive use of distributional semantic models, or word embeddings, in both cognitive modeling and practical applications stems from their remarkable ability to represent word meanings. However, relatively little effort has been made to explore what types of information are encoded in distributional word vectors. Knowing what knowledge is embedded in word vectors is important for cognitive modeling with distributional semantic models. In this paper, we therefore attempt to identify the knowledge encoded in word vectors through a computational experiment using Binder et al.'s (2016) featural conceptual representations, which are based on neurobiologically motivated attributes. In the experiment, these conceptual vectors are predicted from text-based word vectors using a neural network and a linear transformation, and prediction performance is compared across types of information. The analysis demonstrates that abstract information is generally predicted more accurately from word vectors than perceptual and spatiotemporal information; prediction accuracy is particularly high for cognitive and social information. Emotional information is also successfully predicted for abstract words. These results indicate that language can be a major source of knowledge about abstract attributes, supporting the recent view that emphasizes the importance of language for abstract concepts. Furthermore, we show that word vectors can capture some types of perceptual and spatiotemporal information about concrete concepts and some relevant word categories, suggesting that language statistics encode more perceptual knowledge than is often assumed.
© 2020 Cognitive Science Society, Inc.
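The mapping described in the abstract can be illustrated with a minimal sketch: fitting a linear transformation (here, closed-form ridge regression) from word vectors to attribute-rating vectors and scoring prediction accuracy per attribute. All data below are synthetic stand-ins, not the actual embeddings or the Binder et al. (2016) norms; the dimensions and the ridge mapping are assumptions for illustration, one of the two mapping methods the abstract mentions.

```python
import numpy as np

# Synthetic stand-ins for the real data (assumption: 65 attributes,
# roughly matching the number of Binder et al.'s semantic features).
rng = np.random.default_rng(0)
n_words, emb_dim, n_features = 200, 50, 65
X = rng.normal(size=(n_words, emb_dim))           # text-based word vectors
W_true = rng.normal(size=(emb_dim, n_features))   # unknown ground-truth map
Y = X @ W_true + 0.1 * rng.normal(size=(n_words, n_features))  # attribute ratings

def fit_ridge(X, Y, lam=1.0):
    """Closed-form ridge regression: W = (X^T X + lam*I)^(-1) X^T Y."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ Y)

# Hold out words to evaluate generalization of the learned mapping.
train, test = slice(0, 150), slice(150, 200)
W = fit_ridge(X[train], Y[train])
Y_pred = X[test] @ W

# Per-attribute Pearson correlation between predicted and true ratings,
# a common way to compare prediction performance across attribute types.
corrs = [np.corrcoef(Y_pred[:, j], Y[test][:, j])[0, 1]
         for j in range(n_features)]
print(f"mean per-attribute r = {np.mean(corrs):.2f}")
```

In the study itself, per-attribute accuracy like this is what allows comparing how well abstract versus perceptual or spatiotemporal information is recovered from the word vectors.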

Keywords:  Abstract words; Computational modeling; Conceptual representation; Distributional semantic models; Language-derived information; Word embedding; Word vectors

Year:  2020        PMID: 32458523     DOI: 10.1111/cogs.12844

Source DB:  PubMed          Journal:  Cogn Sci        ISSN: 0364-0213


Related articles: 3 in total

1.  Semantic projection recovers rich human knowledge of multiple object features from word embeddings.

Authors:  Gabriel Grand; Idan Asher Blank; Francisco Pereira; Evelina Fedorenko
Journal:  Nat Hum Behav       Date:  2022-04-14

2.  Modelling brain representations of abstract concepts.

Authors:  Daniel Kaiser; Arthur M Jacobs; Radoslaw M Cichy
Journal:  PLoS Comput Biol       Date:  2022-02-04       Impact factor: 4.475

3.  A test of indirect grounding of abstract concepts using multimodal distributional semantics.

Authors:  Akira Utsumi
Journal:  Front Psychol       Date:  2022-10-04
