
Sparse PCA with Oracle Property.

Quanquan Gu, Zhaoran Wang, Han Liu.

Abstract

In this paper, we study the estimation of the k-dimensional sparse principal subspace of a covariance matrix Σ in the high-dimensional setting. We aim to recover the oracle principal subspace solution, i.e., the principal subspace estimator obtained by assuming the true support is known a priori. To this end, we propose a family of estimators based on the semidefinite relaxation of sparse PCA with novel regularizations. In particular, under a weak assumption on the magnitude of the population projection matrix, one estimator within this family exactly recovers the true support with high probability, has rank exactly k, and attains a [Formula: see text] statistical rate of convergence, where s is the subspace sparsity level and n the sample size. Compared with existing support recovery results for sparse PCA, our approach does not hinge on the spiked covariance model or the limited correlation condition. As a complement to the first estimator, which enjoys the oracle property, we prove that another estimator within the family achieves a sharper statistical rate of convergence than the standard semidefinite relaxation of sparse PCA, even when the above assumption on the magnitude of the projection matrix is violated. We validate the theoretical results with numerical experiments on synthetic datasets.
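The abstract's starting point is the standard semidefinite relaxation of sparse PCA: maximize ⟨Σ̂, Π⟩ − λ‖Π‖₁ over the Fantope F^k = {Π : 0 ⪯ Π ⪯ I, tr(Π) = k}, a convex body containing the rank-k projection matrices. The sketch below is an illustrative ADMM solver for this baseline relaxation (in the style of Fantope projection and selection), not the paper's novel regularized estimators; the function names and parameters are my own.

```python
import numpy as np

def fantope_projection(A, k):
    """Project a symmetric matrix A onto the Fantope
    F^k = {X : 0 <= X <= I, trace(X) = k}."""
    w, V = np.linalg.eigh(A)
    # Find theta by bisection so that sum(clip(w - theta, 0, 1)) = k.
    lo, hi = w.min() - 1.0, w.max()
    for _ in range(100):
        theta = (lo + hi) / 2.0
        if np.clip(w - theta, 0.0, 1.0).sum() > k:
            lo = theta
        else:
            hi = theta
    gamma = np.clip(w - (lo + hi) / 2.0, 0.0, 1.0)
    return (V * gamma) @ V.T

def sparse_pca_sdp(S, k, lam, rho=1.0, n_iter=200):
    """ADMM for the standard semidefinite relaxation:
    maximize <S, Pi> - lam * ||Pi||_1  subject to  Pi in F^k.
    Returns the sparse iterate Z (approximate projection matrix)."""
    p = S.shape[0]
    Pi = np.zeros((p, p))
    Z = np.zeros((p, p))
    U = np.zeros((p, p))  # scaled dual variable
    for _ in range(n_iter):
        # Pi-step: quadratic + linear term, constrained to the Fantope.
        Pi = fantope_projection(Z - U + S / rho, k)
        # Z-step: elementwise soft-thresholding handles the l1 penalty.
        T = Pi + U
        Z = np.sign(T) * np.maximum(np.abs(T) - lam / rho, 0.0)
        # Dual update.
        U = U + Pi - Z
    return Z
```

On a population covariance with a single sparse spike, the leading block of the returned matrix concentrates on the true support while off-support entries are thresholded to zero, which is the support-recovery behavior the paper strengthens to an exact oracle property.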

Year: 2014        PMID: 25684971        PMCID: PMC4326026

Source DB: PubMed        Journal: Adv Neural Inf Process Syst        ISSN: 1049-5258


References (6 in total)

1.  Tighten after Relax: Minimax-Optimal Sparse PCA in Polynomial Time.

Authors:  Zhaoran Wang; Huanran Lu; Han Liu
Journal:  Adv Neural Inf Process Syst       Date:  2014

2.  A Strictly Contractive Peaceman-Rachford Splitting Method for Convex Programming.

Authors:  Bingsheng He; Han Liu; Zhaoran Wang; Xiaoming Yuan
Journal:  SIAM J Optim       Date:  2014-07       Impact factor: 2.850

3.  On Consistency and Sparsity for Principal Components Analysis in High Dimensions.

Authors:  Iain M Johnstone; Arthur Yu Lu
Journal:  J Am Stat Assoc       Date:  2009-06-01       Impact factor: 5.033

4.  Coordinate Descent Algorithms for Nonconvex Penalized Regression, with Applications to Biological Feature Selection.

Authors:  Patrick Breheny; Jian Huang
Journal:  Ann Appl Stat       Date:  2011-01-01       Impact factor: 2.083

5.  Optimal Computational and Statistical Rates of Convergence for Sparse Nonconvex Learning Problems.

Authors:  Zhaoran Wang; Han Liu; Tong Zhang
Journal:  Ann Stat       Date:  2014       Impact factor: 4.028

6.  Minimax Bounds for Sparse PCA with Noisy High-Dimensional Data.

Authors:  Aharon Birnbaum; Iain M Johnstone; Boaz Nadler; Debashis Paul
Journal:  Ann Stat       Date:  2013-06       Impact factor: 4.028

Cited By (1 in total)

1.  Stochastic convex sparse principal component analysis.

Authors:  Inci M Baytas; Kaixiang Lin; Fei Wang; Anil K Jain; Jiayu Zhou
Journal:  EURASIP J Bioinform Syst Biol       Date:  2016-09-09
