
Integrating experiential and distributional data to learn semantic representations.

Mark Andrews, Gabriella Vigliocco, David Vinson.

Abstract

The authors identify 2 major types of statistical data from which semantic representations can be learned. These are denoted as experiential data and distributional data. Experiential data are derived by way of experience with the physical world and comprise the sensory-motor data obtained through sense receptors. Distributional data, by contrast, describe the statistical distribution of words across spoken and written language. The authors claim that experiential and distributional data represent distinct data types and that each is a nontrivial source of semantic information. Their theoretical proposal is that human semantic representations are derived from an optimal statistical combination of these 2 data types. Using a Bayesian probabilistic model, they demonstrate how word meanings can be learned by treating experiential and distributional data as a single joint distribution and learning the statistical structure that underlies it. The semantic representations that are learned in this manner are measurably more realistic, as verified by comparison to a set of human-based measures of semantic representation, than those available from either data type individually or from both sources independently. This is not a result of merely using quantitatively more data, but rather it is because experiential and distributional data are qualitatively distinct, yet intercorrelated, types of data. The semantic representations that are learned are based on statistical structures that exist both within and between the experiential and distributional data types. Copyright (c) 2009 APA, all rights reserved.
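The abstract's central idea, that representations built from the two data types jointly differ from those built from either alone, can be illustrated with a toy sketch. This is not the authors' Bayesian model; the words, counts, and feature names below are invented purely for illustration. Each word gets a distributional vector (co-occurrence counts) and an experiential vector (sensory-motor feature ratings), and the joint representation is their length-normalized concatenation:

```python
import math

# Invented toy data. Distributional: co-occurrence counts with
# context words ("eat", "read", "drive"). Experiential: sensory-motor
# feature ratings ("sweet", "graspable", "has-pages").
distributional = {
    "apple": [8, 1, 0],
    "pear":  [7, 0, 0],
    "book":  [1, 9, 0],
}
experiential = {
    "apple": [5, 4, 0],
    "pear":  [5, 4, 0],
    "book":  [0, 4, 5],
}

def cosine(u, v):
    """Cosine similarity between two vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def joint(word):
    """Combine both data types into one representation by
    concatenation, normalizing each half so that neither
    data type dominates the similarity computation."""
    def unit(v):
        s = math.sqrt(sum(x * x for x in v)) or 1.0
        return [x / s for x in v]
    return unit(distributional[word]) + unit(experiential[word])

# Similarity from the combined representation reflects structure
# in both data types at once.
print(cosine(joint("apple"), joint("pear")))  # high: similar in both
print(cosine(joint("apple"), joint("book")))  # lower: differ in both
```

A simple concatenation like this captures only the within-type structure; the paper's contribution is learning a single joint distribution that also exploits the correlations *between* the two data types.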


Year:  2009        PMID: 19618982     DOI: 10.1037/a0016261

Source DB:  PubMed          Journal:  Psychol Rev        ISSN: 0033-295X            Impact factor:   8.934


Cited by: 69 in total

1.  The contributions of language and experience to the representation of abstract and concrete words: different weights but similar organizations.

Authors:  J Frederico Marques; Ludmila D Nunes
Journal:  Mem Cognit       Date:  2012-11

2.  Perceptual and motor attribute ratings for 559 object concepts.

Authors:  Ben D Amsel; Thomas P Urbach; Marta Kutas
Journal:  Behav Res Methods       Date:  2012-12

3.  Abstract Conceptual Feature Ratings Predict Gaze Within Written Word Arrays: Evidence From a Visual Wor(l)d Paradigm.

Authors:  Silvia Primativo; Jamie Reilly; Sebastian J Crutch
Journal:  Cogn Sci       Date:  2016-02-22

4.  Language networks associated with computerized semantic indices.

Authors:  Serguei V S Pakhomov; David T Jones; David S Knopman
Journal:  Neuroimage       Date:  2014-10-12       Impact factor: 6.556

Review 5.  Three symbol ungrounding problems: Abstract concepts and the future of embodied cognition.

Authors:  Guy Dove
Journal:  Psychon Bull Rev       Date:  2016-08

Review 6.  Taxonomic and thematic semantic systems.

Authors:  Daniel Mirman; Jon-Frederick Landrigan; Allison E Britt
Journal:  Psychol Bull       Date:  2017-03-23       Impact factor: 17.737

Review 7.  Language as a disruptive technology: abstract concepts, embodiment and the flexible mind.

Authors:  Guy Dove
Journal:  Philos Trans R Soc Lond B Biol Sci       Date:  2018-08-05       Impact factor: 6.237

Review 8.  The multifaceted abstract brain.

Authors:  Rutvik H Desai; Megan Reilly; Wessel van Dam
Journal:  Philos Trans R Soc Lond B Biol Sci       Date:  2018-08-05       Impact factor: 6.237

9.  Learning abstract words and concepts: insights from developmental language disorder.

Authors:  Marta Ponari; Courtenay Frazier Norbury; Armand Rotaru; Alessandro Lenci; Gabriella Vigliocco
Journal:  Philos Trans R Soc Lond B Biol Sci       Date:  2018-08-05       Impact factor: 6.237

Review 10.  Boundaries to grounding abstract concepts.

Authors:  Diane Pecher; René Zeelenberg
Journal:  Philos Trans R Soc Lond B Biol Sci       Date:  2018-08-05       Impact factor: 6.237


北京卡尤迪生物科技股份有限公司 (Beijing Coyote Bioscience Co., Ltd.) © 2022-2023.