| Literature DB >> 16447385 |
Abstract
Latent semantic analysis (LSA) is a model of knowledge representation for words. It works by performing singular value decomposition on local co-occurrence data from a large collection of documents and then applying dimension reduction to the result. When the reduction is applied, the system forms condensed representations for the words that incorporate higher-order associations. These higher-order associations are primarily responsible for any semantic similarity between words in LSA. In this article, a memory model is described that creates semantic representations for words similar in form to those created by LSA. However, instead of applying dimension reduction, the model builds the representations by using a retrieval mechanism from a well-known account of episodic memory.
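The LSA procedure the abstract describes can be sketched briefly. The following is a minimal, illustrative Python example (not the authors' implementation): the words, co-occurrence counts, and the choice of k = 2 retained dimensions are all invented for demonstration. It shows SVD applied to a word-by-document count matrix, truncation to the largest singular values, and cosine similarity between the resulting condensed word vectors.

```python
import numpy as np

# Hypothetical word-by-document co-occurrence counts (rows = words,
# columns = documents). Words and counts are illustrative only.
words = ["doctor", "nurse", "car"]
X = np.array([
    [2.0, 3.0, 0.0, 0.0],   # "doctor" occurs in documents 1-2
    [1.0, 2.0, 0.0, 1.0],   # "nurse" occurs in documents 1-2 and 4
    [0.0, 0.0, 3.0, 2.0],   # "car" occurs in documents 3-4
])

# Singular value decomposition of the co-occurrence matrix.
U, s, Vt = np.linalg.svd(X, full_matrices=False)

# Dimension reduction: retain only the k largest singular values.
k = 2
word_vecs = U[:, :k] * s[:k]   # condensed word representations

def cosine(a, b):
    """Cosine similarity between two word vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Words that share contexts end up closer in the reduced space
# than words that do not.
print("doctor~nurse:", round(cosine(word_vecs[0], word_vecs[1]), 3))
print("doctor~car:  ", round(cosine(word_vecs[0], word_vecs[2]), 3))
```

The truncation step is where higher-order associations arise: even words that never co-occur directly can end up with similar condensed vectors if they occur in similar contexts.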
Year: 2005 PMID: 16447385 DOI: 10.3758/bf03196761
Source DB: PubMed Journal: Psychon Bull Rev ISSN: 1069-9384