
A least-squares framework for Component Analysis.

Fernando De la Torre

Abstract

Over the last century, Component Analysis (CA) methods such as Principal Component Analysis (PCA), Linear Discriminant Analysis (LDA), Canonical Correlation Analysis (CCA), Locality Preserving Projections (LPP), and Spectral Clustering (SC) have been extensively used as a feature-extraction step for modeling, classification, visualization, and clustering. CA techniques are appealing because many can be formulated as eigen-problems, offering great potential for learning linear and nonlinear representations of data in closed form. However, the eigen-formulation often conceals important analytic and computational drawbacks of CA techniques, such as solving generalized eigen-problems with rank-deficient matrices (e.g., the small-sample-size problem), the lack of an intuitive interpretation of normalization factors, and the difficulty of understanding commonalities and differences between CA methods. This paper proposes a unified least-squares framework for formulating many CA methods. We show how PCA, LDA, CCA, LPP, SC, and their kernel and regularized extensions each correspond to a particular instance of least-squares weighted kernel reduced-rank regression (LS-WKRRR). The LS-WKRRR formulation of CA methods has several benefits: 1) it provides a clean connection between many CA techniques and an intuitive framework for understanding normalization factors; 2) it yields efficient numerical schemes for solving CA techniques; 3) it overcomes the small-sample-size problem; and 4) it provides a framework for easily extending CA methods. We derive weighted generalizations of PCA, LDA, SC, and CCA, as well as several new CA techniques.
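A minimal sketch (not the paper's full LS-WKRRR, and with synthetic data assumed here) of the abstract's core idea: the subspace that PCA obtains as an eigen-problem is also the solution of a least-squares low-rank regression problem, min over B, A of ||D - BA||_F^2, which can be solved either in closed form via the SVD or iteratively by alternating least squares.

```python
import numpy as np

# Synthetic, already-centered data matrix D (5 features x 200 samples)
# built with a well-separated spectrum so the rank-2 subspace is unambiguous.
rng = np.random.default_rng(0)
U0, _ = np.linalg.qr(rng.standard_normal((5, 5)))
V0, _ = np.linalg.qr(rng.standard_normal((200, 5)))
D = U0 @ np.diag([10.0, 8.0, 1.0, 0.5, 0.1]) @ V0.T

k = 2
# Closed-form (eigen/SVD) answer: the top-k left singular vectors of D
# span the principal subspace.
U, _, _ = np.linalg.svd(D, full_matrices=False)
P_svd = U[:, :k] @ U[:, :k].T          # orthogonal projector onto PCA subspace

# Alternating least squares on min ||D - B A||_F^2, from a random start.
B = rng.standard_normal((5, k))
for _ in range(100):
    A = np.linalg.lstsq(B, D, rcond=None)[0]        # fix B, solve for A
    B = np.linalg.lstsq(A.T, D.T, rcond=None)[0].T  # fix A, solve for B

Q, _ = np.linalg.qr(B)                 # orthonormal basis for the span of B
P_als = Q @ Q.T

# Both routes recover the same k-dimensional subspace (equal projectors).
print(np.allclose(P_svd, P_als, atol=1e-6))
```

The factors B and A are only identified up to an invertible transform, so the comparison is between the projectors onto their column spans rather than the factors themselves; this is the least-squares view that the paper's framework weights and kernelizes.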

MeSH:

Year:  2012        PMID: 21911913     DOI: 10.1109/TPAMI.2011.184

Source DB:  PubMed          Journal:  IEEE Trans Pattern Anal Mach Intell        ISSN: 0162-8828            Impact factor:   6.226


Related articles: 8 in total

1.  Wavelet-based Computationally-Efficient Computer-Aided Characterization of Liver Steatosis using Conventional B-mode Ultrasound Images.

Authors:  Manar N Amin; Muhammad A Rushdi; Raghda N Marzaban; Ayman Yosry; Kang Kim; Ahmed M Mahmoud
Journal:  Biomed Signal Process Control       Date:  2019-04-05       Impact factor: 3.880

2.  Demixed principal component analysis of neural population data.

Authors:  Dmitry Kobak; Wieland Brendel; Christos Constantinidis; Claudia E Feierstein; Adam Kepecs; Zachary F Mainen; Xue-Lian Qi; Ranulfo Romo; Naoshige Uchida; Christian K Machens
Journal:  Elife       Date:  2016-04-12       Impact factor: 8.140

3.  Joint feature-sample selection and robust diagnosis of Parkinson's disease from MRI data.

Authors:  Ehsan Adeli; Feng Shi; Le An; Chong-Yaw Wee; Guorong Wu; Tao Wang; Dinggang Shen
Journal:  Neuroimage       Date:  2016-06-10       Impact factor: 6.556

4.  Semi-Supervised Discriminative Classification Robust to Sample-Outliers and Feature-Noises.

Authors:  Ehsan Adeli; Kim-Han Thung; Le An; Guorong Wu; Feng Shi; Tao Wang; Dinggang Shen
Journal:  IEEE Trans Pattern Anal Mach Intell       Date:  2018-01-17       Impact factor: 6.226

5.  Discriminative Scale Learning (DiScrn): Applications to Prostate Cancer Detection from MRI and Needle Biopsies.

Authors:  Haibo Wang; Satish Viswanath; Anant Madabhushi
Journal:  Sci Rep       Date:  2017-09-28       Impact factor: 4.379

6.  Low-rank graph optimization for multi-view dimensionality reduction.

Authors:  Youcheng Qian; Xueyan Yin; Jun Kong; Jianzhong Wang; Wei Gao
Journal:  PLoS One       Date:  2019-12-18       Impact factor: 3.240

7.  Flattening the curves: on-off lock-down strategies for COVID-19 with an application to Brazil.

Authors:  Luís Tarrataca; Claudia Mazza Dias; Diego Barreto Haddad; Edilson Fernandes De Arruda
Journal:  J Math Ind       Date:  2021-01-06

8.  Deep fusion of gray level co-occurrence matrices for lung nodule classification.

Authors:  Ahmed Saihood; Hossein Karshenas; Ahmad Reza Naghsh Nilchi
Journal:  PLoS One       Date:  2022-09-29       Impact factor: 3.752

