
Compounding as Abstract Operation in Semantic Space: Investigating relational effects through a large-scale, data-driven computational model.

Marco Marelli, Christina L. Gagné, Thomas L. Spalding

Abstract

In many languages, compounding is a fundamental process for the generation of novel words. When this process is productive (as, e.g., in English), native speakers can juxtapose two words to create novel compounds that can be readily understood by other speakers. The present paper proposes a large-scale, data-driven computational system for compound semantic processing based on distributional semantics, the CAOSS model (Compounding as Abstract Operation in Semantic Space). In CAOSS, word meanings are represented as vectors encoding their lexical co-occurrences in a reference corpus. Given two constituent words, their composed representation (the compound) is computed by using matrices representing the abstract properties of constituent roles (modifier vs. head). The matrices are also induced through examples of language usage. The model is then validated against behavioral results concerning the processing of novel compounds, and in particular relational effects on response latencies. The effects of relational priming and relational dominance are considered. CAOSS predictions are shown to pattern with previous results, in terms of both the impact of relational information and the dissociations related to the different constituent roles. The simulations indicate that relational information is implicitly reflected in language usage, suggesting that human speakers can learn these aspects from language experience and automatically apply them to the processing of new word combinations. The present model is flexible enough to emulate this procedure, suggesting that relational effects might emerge as a by-product of nuanced operations across distributional patterns.
Copyright © 2017 Elsevier B.V. All rights reserved.
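The composition step described in the abstract (compound vector computed from the two constituent vectors via role matrices for modifier and head) can be sketched in a few lines of numpy. This is a toy illustration only, not the published model: the vectors are random rather than corpus-derived, and the least-squares induction of the matrices (here named M and H) is an assumption based on the abstract's description of learning the matrices from examples of language usage.

```python
import numpy as np

rng = np.random.default_rng(0)
d, n = 5, 200                        # toy dimensionality and number of training compounds
U = rng.normal(size=(n, d))          # modifier constituent vectors
V = rng.normal(size=(n, d))          # head constituent vectors

# Hidden role matrices used only to generate toy "observed" compound vectors;
# in the real model these targets would come from corpus co-occurrence counts.
M_true = rng.normal(size=(d, d))
H_true = rng.normal(size=(d, d))
C = U @ M_true.T + V @ H_true.T      # c_i = M u_i + H v_i

# Induce M and H jointly by least squares from (modifier, head, compound) examples.
X = np.hstack([U, V])                # n x 2d design matrix [u_i ; v_i]
W, *_ = np.linalg.lstsq(X, C, rcond=None)
M_hat, H_hat = W[:d].T, W[d:].T

# Compose a novel compound from two unseen constituents.
u_new, v_new = rng.normal(size=d), rng.normal(size=d)
c_new = M_hat @ u_new + H_hat @ v_new
```

Because the toy targets are noiseless and the design matrix has full column rank, the regression recovers the generating matrices exactly; with real distributional vectors the fit would only approximate the observed compound vectors.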


Keywords:  Compound words; Conceptual combination; Distributional semantics; Novel compounds; Relational information


Year:  2017        PMID: 28582684     DOI: 10.1016/j.cognition.2017.05.026

Source DB:  PubMed          Journal:  Cognition        ISSN: 0010-0277


Related articles: 6 in total

1.  Language experience shapes relational knowledge of compound words.

Authors:  Daniel Schmidtke; Christina L Gagné; Victor Kuperman; Thomas L Spalding
Journal:  Psychon Bull Rev       Date:  2018-08

2. (Review) Grounding the neurobiology of language in first principles: The necessity of non-language-centric explanations for language comprehension.

Authors:  Uri Hasson; Giovanna Egidi; Marco Marelli; Roel M Willems
Journal:  Cognition       Date:  2018-07-24

3. (Review) From decomposition to distributed theories of morphological processing in reading.

Authors:  Patience Stevens; David C Plaut
Journal:  Psychon Bull Rev       Date:  2022-05-20

4.  Distilling vector space model scores for the assessment of constructed responses with bifactor Inbuilt Rubric method and latent variables.

Authors:  José Ángel Martínez-Huertas; Ricardo Olmos; Guillermo Jorge-Botana; José A León
Journal:  Behav Res Methods       Date:  2022-01-11

5.  Not just form, not just meaning: Words with consistent form-meaning mappings are learned earlier.

Authors:  Giovanni Cassani; Niklas Limacher
Journal:  Q J Exp Psychol (Hove)       Date:  2021-10-21       Impact factor: 2.138

6.  An eye-tracking study of reading long and short novel and lexicalized compound words.

Authors:  Jukka Hyönä; Alexander Pollatsek; Minna Koski; Henri Olkoniemi
Journal:  J Eye Mov Res       Date:  2020-08-04       Impact factor: 0.957

