
Optimal Feature Selection in High-Dimensional Discriminant Analysis.

Mladen Kolar, Han Liu.

Abstract

We consider the high-dimensional discriminant analysis problem. For this problem, different methods have been proposed and justified by establishing exact convergence rates for the classification risk, as well as ℓ2 convergence results for the discriminant rule. However, a sharp theoretical analysis of the variable selection performance of these procedures has not been established, even though model interpretation is of fundamental importance in scientific data analysis. This paper bridges the gap by providing sharp sufficient conditions for consistent variable selection using sparse discriminant analysis (Mai et al., 2012). Through careful analysis, we establish rates of convergence that are significantly faster than the best known results and admit an optimal scaling of the sample size n, dimensionality p, and sparsity level s in the high-dimensional setting. The sufficient conditions are complemented by necessary information-theoretic limits on the variable selection problem in the context of high-dimensional discriminant analysis. Exploiting a numerical equivalence result, our analysis also establishes optimal results for the ROAD estimator (Fan et al., 2012) and the sparse optimal scaling estimator (Clemmensen et al., 2011). Furthermore, we analyze an exhaustive search procedure, whose performance serves as a benchmark, and show that it is variable selection consistent under weaker conditions. Extensive simulations demonstrating the sharpness of the bounds are also provided.
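The sparse discriminant analysis estimator referenced in the abstract (Mai et al., 2012) can be formulated as an ℓ1-penalized least-squares problem: class labels are recoded numerically and the discriminant direction is obtained as a lasso fit. The sketch below illustrates this formulation only; the function name `sparse_lda`, the coordinate-descent solver, and the penalty parameterization are illustrative choices, not the authors' implementation.

```python
import numpy as np

def sparse_lda(X, y, lam, n_iter=200):
    """Sketch of the lasso formulation of sparse discriminant analysis.

    X : (n, p) feature matrix; y : binary labels in {0, 1};
    lam : l1 penalty level. Labels are recoded numerically (in the
    spirit of Mai et al., 2012) and beta is fit by cyclic coordinate
    descent on (1/2n)||z - X beta||^2 + lam * ||beta||_1.
    """
    n, p = X.shape
    n1, n2 = np.sum(y == 0), np.sum(y == 1)
    # Numeric recoding of the two classes.
    z = np.where(y == 0, -n / n1, n / n2).astype(float)
    Xc = X - X.mean(axis=0)          # center features
    zc = z - z.mean()
    beta = np.zeros(p)
    col_sq = (Xc ** 2).sum(axis=0)
    for _ in range(n_iter):
        for j in range(p):
            # Partial residual excluding coordinate j.
            r = zc - Xc @ beta + Xc[:, j] * beta[j]
            rho = Xc[:, j] @ r
            # Soft-thresholding update for the lasso.
            beta[j] = np.sign(rho) * max(abs(rho) - n * lam, 0.0) / col_sq[j]
    return beta
```

Variable selection consistency, the property analyzed in the paper, asks that the nonzero coordinates of the fitted `beta` recover exactly the informative features as n, p, and s grow.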

Keywords:  discriminant analysis; high-dimensional statistics; optimal rates of convergence; variable selection

Year:  2015        PMID: 25620807      PMCID: PMC4302965          DOI: 10.1109/TIT.2014.2381241

Source DB:  PubMed          Journal:  IEEE Trans Inf Theory        ISSN: 0018-9448            Impact factor:   2.501


  7 in total

1.  Sparse linear discriminant analysis for simultaneous testing for the significance of a gene set/pathway and gene selection.

Authors:  Michael C Wu; Lingsong Zhang; Zhaoxi Wang; David C Christiani; Xihong Lin
Journal:  Bioinformatics       Date:  2009-01-25       Impact factor: 6.937

2.  High Dimensional Classification Using Features Annealed Independence Rules.

Authors:  Jianqing Fan; Yingying Fan
Journal:  Ann Stat       Date:  2008       Impact factor: 4.028

3.  A ROAD to Classification in High Dimensional Space.

Authors:  Jianqing Fan; Yang Feng; Xin Tong
Journal:  J R Stat Soc Series B Stat Methodol       Date:  2012-04-12       Impact factor: 4.488

4.  On Consistency and Sparsity for Principal Components Analysis in High Dimensions.

Authors:  Iain M Johnstone; Arthur Yu Lu
Journal:  J Am Stat Assoc       Date:  2009-06-01       Impact factor: 5.033

5.  Improved centroids estimation for the nearest shrunken centroid classifier.

Authors:  Sijian Wang; Ji Zhu
Journal:  Bioinformatics       Date:  2007-03-24       Impact factor: 6.937

6.  Penalized classification using Fisher's linear discriminant.

Authors:  Daniela M Witten; Robert Tibshirani
Journal:  J R Stat Soc Series B Stat Methodol       Date:  2011-11       Impact factor: 4.488

7.  Covariance-regularized regression and classification for high-dimensional problems.

Authors:  Daniela M Witten; Robert Tibshirani
Journal:  J R Stat Soc Series B Stat Methodol       Date:  2009-02-20       Impact factor: 4.488

