
Using context to build semantics.

Peter J Kwantes

Abstract

Latent semantic analysis (LSA) is a model of knowledge representation for words. It works by performing singular value decomposition on local co-occurrence data drawn from a large collection of documents and then applying dimension reduction to the result. The reduction gives words condensed representations that incorporate higher-order associations, and these higher-order associations are primarily responsible for any semantic similarity between words in LSA. In this article, a memory model is described that creates semantic representations for words similar in form to those created by LSA. Instead of applying dimension reduction, however, the model builds the representations by using a retrieval mechanism from a well-known account of episodic memory.
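The LSA pipeline the abstract summarizes can be sketched in a few lines. This is a minimal illustration, not the article's model: the toy corpus, the choice of k = 2, and the word-by-document (rather than weighted or entropy-normalized) counts are all assumptions made for demonstration.

```python
import numpy as np

# Hypothetical toy corpus with two topics (animals vs. finance).
docs = [
    "dog barks at the cat",
    "cat chases the dog",
    "stocks fell on the market",
    "market rallies as stocks rise",
]

# 1. Local co-occurrence data: a word-by-document count matrix.
vocab = sorted({w for d in docs for w in d.split()})
index = {w: i for i, w in enumerate(vocab)}
counts = np.zeros((len(vocab), len(docs)))
for j, d in enumerate(docs):
    for w in d.split():
        counts[index[w], j] += 1

# 2. Singular value decomposition of the count matrix.
U, s, Vt = np.linalg.svd(counts, full_matrices=False)

# 3. Dimension reduction: keep only the k largest singular values,
#    giving each word a condensed k-dimensional representation.
k = 2
word_vecs = U[:, :k] * s[:k]

def similarity(w1, w2):
    """Cosine similarity between two words' reduced LSA vectors."""
    a, b = word_vecs[index[w1]], word_vecs[index[w2]]
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

# Words sharing contexts end up with similar reduced vectors,
# even though similarity is never computed from raw counts directly.
print(similarity("dog", "cat"), similarity("dog", "stocks"))
```

In the reduced space, "dog" and "cat" come out far more similar than "dog" and "stocks", because the truncated components capture the higher-order associations running through their shared contexts.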


Year:  2005        PMID: 16447385     DOI: 10.3758/bf03196761

Source DB:  PubMed          Journal:  Psychon Bull Rev        ISSN: 1069-9384


References (2 in total)

1.  Modeling lexical decision and word naming as a retrieval process.

Authors:  P J Kwantes; D J Mewhort
Journal:  Can J Exp Psychol       Date:  1999-12

2.  A context noise model of episodic word recognition.

Authors:  S Dennis; M S Humphreys
Journal:  Psychol Rev       Date:  2001-04       Impact factor: 8.934

Cited by (11 in total)

1.  The dimensionality of discourse.

Authors:  Isidoros Doxas; Simon Dennis; William L Oliver
Journal:  Proc Natl Acad Sci U S A       Date:  2010-03-01       Impact factor: 11.205

2.  An instance theory of associative learning.

Authors:  Randall K Jamieson; Matthew J C Crump; Samuel D Hannah
Journal:  Learn Behav       Date:  2012-03       Impact factor: 1.986

3.  A literature-based assessment of concept pairs as a measure of semantic relatedness.

Authors:  T Elizabeth Workman; Graciela Rosemblat; Marcelo Fiszman; Thomas C Rindflesch
Journal:  AMIA Annu Symp Proc       Date:  2013-11-16

4.  A model of the transition to behavioural and cognitive modernity using reflexively autocatalytic networks.

Authors:  Liane Gabora; Mike Steel
Journal:  J R Soc Interface       Date:  2020-10-28       Impact factor: 4.118

5.  [Review] Bridging the theoretical gap between semantic representation models without the pressure of a ranking: some lessons learnt from LSA.

Authors:  Guillermo Jorge-Botana; Ricardo Olmos; José María Luzón
Journal:  Cogn Process       Date:  2019-09-25

6.  Encoding sequential information in semantic space models: comparing holographic reduced representation and random permutation.

Authors:  Gabriel Recchia; Magnus Sahlgren; Pentti Kanerva; Michael N Jones
Journal:  Comput Intell Neurosci       Date:  2015-04-07

7.  Using a high-dimensional graph of semantic space to model relationships among words.

Authors:  Alice F Jackson; Donald J Bolger
Journal:  Front Psychol       Date:  2014-05-12

8.  Searching for Semantic Knowledge: A Vector Space Semantic Analysis of the Feature Generation Task.

Authors:  Rebecca A Cutler; Melissa C Duff; Sean M Polyn
Journal:  Front Hum Neurosci       Date:  2019-10-04       Impact factor: 3.169

9.  Different influences on lexical priming for integrative, thematic, and taxonomic relations.

Authors:  Lara L Jones; Sabrina Golonka
Journal:  Front Hum Neurosci       Date:  2012-07-11       Impact factor: 3.169

10.  Mental mechanisms for topics identification.

Authors:  Louis Massey
Journal:  Comput Intell Neurosci       Date:  2014-03-13

北京卡尤迪生物科技股份有限公司 (Beijing Coyote Bioscience Co., Ltd.) © 2022-2023.