
Intrinsic Grassmann Averages for Online Linear, Robust and Nonlinear Subspace Learning.

Rudrasis Chakraborty, Liu Yang, Søren Hauberg, Baba Vemuri.

Abstract

Principal Component Analysis (PCA) and Kernel Principal Component Analysis (KPCA) are fundamental methods in machine learning for dimensionality reduction. The former computes a linear subspace approximation in finite dimensions, while the latter operates in an often infinite-dimensional Reproducing Kernel Hilbert Space (RKHS). In this paper, we present a geometric framework for computing the principal linear subspaces in both situations, as well as in the robust PCA case, which amounts to computing the intrinsic average on the space of all subspaces: the Grassmann manifold. Points on this manifold are defined as the subspaces spanned by K-tuples of observations. The intrinsic Grassmann average of these subspaces is shown to coincide with the principal components of the observations when they are drawn from a Gaussian distribution. We show similar results in the RKHS case and provide an efficient algorithm for computing the projection onto this average subspace. The result is a method akin to KPCA that is substantially faster. Further, we present a novel online version of KPCA based on our geometric framework. Competitive performance of all our algorithms is demonstrated on a variety of real and synthetic data sets.
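The core claim of the abstract — that the intrinsic (Fréchet) average of observation-spanned subspaces on the Grassmann manifold recovers the principal components of Gaussian data — can be illustrated for the simplest case K = 1. The sketch below is not the authors' algorithm; it is a minimal illustration assuming standard log/exp maps on Gr(1, n) (1-D subspaces, i.e. real projective space) and a plain Karcher-mean iteration, then compares the resulting subspace with the leading principal direction:

```python
import numpy as np

def log_map(p, q):
    """Log map on Gr(1, n): tangent at span(p) pointing toward span(q)."""
    q = np.sign(p @ q) * q            # subspaces are sign-invariant; align representatives
    c = np.clip(p @ q, -1.0, 1.0)
    theta = np.arccos(c)              # principal angle between the two lines
    v = q - c * p                     # component of q orthogonal to p
    n = np.linalg.norm(v)
    return np.zeros_like(p) if n < 1e-12 else (theta / n) * v

def exp_map(p, v):
    """Exp map on Gr(1, n): follow the geodesic from span(p) along tangent v."""
    t = np.linalg.norm(v)
    return p if t < 1e-12 else np.cos(t) * p + np.sin(t) * (v / t)

def intrinsic_grassmann_average(X, iters=50):
    """Karcher-mean iteration: Fréchet mean of the lines spanned by the rows of X."""
    U = X / np.linalg.norm(X, axis=1, keepdims=True)   # unit representatives
    mu = U[0]
    for _ in range(iters):
        g = np.mean([log_map(mu, u) for u in U], axis=0)  # Riemannian gradient step
        mu = exp_map(mu, g)
        if np.linalg.norm(g) < 1e-10:
            break
    return mu

# Anisotropic Gaussian data: first axis carries most of the variance.
rng = np.random.default_rng(0)
X = rng.multivariate_normal(np.zeros(3), np.diag([10.0, 1.0, 0.5]), size=2000)

mu = intrinsic_grassmann_average(X)
pc1 = np.linalg.eigh(X.T @ X)[1][:, -1]   # leading principal direction
print(abs(mu @ pc1))                       # close to 1: the two subspaces agree
```

The paper's full framework generalizes this to K-dimensional subspaces, to the RKHS setting, and to online updates; this sketch only shows why averaging on the Grassmann manifold is a sensible surrogate for eigendecomposition in the Gaussian, K = 1 case.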

Year:  2020        PMID: 32386140     DOI: 10.1109/TPAMI.2020.2992392

Source DB:  PubMed          Journal:  IEEE Trans Pattern Anal Mach Intell        ISSN: 0162-8828            Impact factor:   6.226


Cited by (1 in total)

1.  An Online Riemannian PCA for Stochastic Canonical Correlation Analysis.

Authors:  Zihang Meng; Rudrasis Chakraborty; Vikas Singh
Journal:  Adv Neural Inf Process Syst       Date:  2021-12
