Literature DB >> 21919780

On the relation of slow feature analysis and Laplacian eigenmaps.

Henning Sprekeler.

Abstract

The past decade has seen a rise of interest in Laplacian eigenmaps (LEMs) for nonlinear dimensionality reduction. LEMs have been used in spectral clustering, in semisupervised learning, and for providing efficient state representations for reinforcement learning. Here, we show that LEMs are closely related to slow feature analysis (SFA), a biologically inspired, unsupervised learning algorithm originally designed for learning invariant visual representations. We show that SFA can be interpreted as a function approximation of LEMs, where the topological neighborhoods required for LEMs are implicitly defined by the temporal structure of the data. Based on this relation, we propose a generalization of SFA to arbitrary neighborhood relations and demonstrate its applicability for spectral clustering. Finally, we review previous work with the goal of providing a unifying view on SFA and LEMs.
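The abstract's central claim can be illustrated numerically: in its linear form, SFA solves a generalized eigenvalue problem on the covariance of temporal derivatives, while an LEM on the chain graph that links temporally adjacent samples yields the eigenvectors of that graph's Laplacian. The following is a minimal sketch of that correspondence, not the paper's own implementation; the formulations are the standard linear SFA and unnormalized-Laplacian LEM, and all names (`slow_features`, `laplacian_eigenmap_temporal`) are illustrative.

```python
import numpy as np

def slow_features(X, n_components=2):
    """Linear SFA: minimize the mean squared temporal derivative of the
    output, subject to unit variance and decorrelation. This reduces to
    the generalized eigenproblem A w = lambda B w, with A the covariance
    of the temporal differences and B the covariance of the data."""
    Xc = X - X.mean(axis=0)
    Xdot = np.diff(Xc, axis=0)
    A = Xdot.T @ Xdot / len(Xdot)
    B = Xc.T @ Xc / len(Xc)
    # Solve via whitening: eigendecompose B^(-1/2) A B^(-1/2), which is symmetric.
    evals_B, evecs_B = np.linalg.eigh(B)
    W = evecs_B @ np.diag(evals_B ** -0.5) @ evecs_B.T  # B^(-1/2)
    evals, evecs = np.linalg.eigh(W @ A @ W)
    order = np.argsort(evals)  # smallest temporal variation = slowest first
    return Xc @ (W @ evecs[:, order[:n_components]])

def laplacian_eigenmap_temporal(T, n_components=2):
    """LEM on the chain graph whose edges join temporally adjacent samples,
    i.e. the neighborhood structure that, per the abstract, SFA implicitly
    defines through the temporal ordering of the data."""
    Wg = np.zeros((T, T))
    idx = np.arange(T - 1)
    Wg[idx, idx + 1] = Wg[idx + 1, idx] = 1.0
    L = np.diag(Wg.sum(axis=1)) - Wg  # unnormalized graph Laplacian
    evals, evecs = np.linalg.eigh(L)
    # Drop the constant eigenvector (eigenvalue 0); the next ones embed the chain.
    return evecs[:, 1:1 + n_components]
```

As a sanity check, if the input signals span the space of the slow chain-graph eigenvectors (e.g. a random linear mixture of them), the first SFA output and the first LEM coordinate agree up to sign and scale, which is the sense in which SFA acts as a function approximator for the LEM solution.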

Year:  2011        PMID: 21919780     DOI: 10.1162/NECO_a_00214

Source DB:  PubMed          Journal:  Neural Comput        ISSN: 0899-7667            Impact factor:   2.026


  3 in total

1.  Learning Structures: Predictive Representations, Replay, and Generalization.

Authors:  Ida Momennejad
Journal:  Curr Opin Behav Sci       Date:  2020-05-05

2.  An intrinsic value system for developing multiple invariant representations with incremental slowness learning.

Authors:  Matthew Luciw; Varun Kompella; Sohrob Kazerounian; Juergen Schmidhuber
Journal:  Front Neurorobot       Date:  2013-05-30       Impact factor: 2.650

3.  Sustained firing of model central auditory neurons yields a discriminative spectro-temporal representation for natural sounds.

Authors:  Michael A Carlin; Mounya Elhilali
Journal:  PLoS Comput Biol       Date:  2013-03-28       Impact factor: 4.475
