
High-dimensional feature selection by feature-wise kernelized Lasso.

Makoto Yamada, Wittawat Jitkrittum, Leonid Sigal, Eric P Xing, Masashi Sugiyama.

Abstract

The goal of supervised feature selection is to find a subset of input features that are responsible for predicting output values. The least absolute shrinkage and selection operator (Lasso) allows computationally efficient feature selection based on linear dependency between input features and output values. In this letter, we consider a feature-wise kernelized Lasso for capturing nonlinear input-output dependency. We first show that with particular choices of kernel functions, nonredundant features with strong statistical dependence on output values can be found in terms of kernel-based independence measures such as the Hilbert-Schmidt independence criterion. We then show that the globally optimal solution can be efficiently computed; this makes the approach scalable to high-dimensional problems. The effectiveness of the proposed method is demonstrated through feature selection experiments for classification and regression with thousands of features.
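The method the abstract describes (often called HSIC Lasso) scores each feature by a centered kernel Gram matrix and fits a non-negative Lasso of the output's Gram matrix on the feature-wise Gram matrices. A minimal sketch of that idea is below; it is an illustration, not the authors' implementation — it assumes Gaussian kernels for both inputs and output, normalized centered Gram matrices, and a simple projected-gradient (ISTA) solver, with the bandwidth choice, `lam`, and iteration count picked arbitrarily.

```python
import numpy as np

def gaussian_gram(x, sigma):
    """Gram matrix of a Gaussian kernel on a 1-D sample vector."""
    diff = x[:, None] - x[None, :]
    return np.exp(-diff**2 / (2 * sigma**2))

def hsic_lasso(X, y, lam=0.05, n_iter=500):
    """Sketch of feature-wise kernelized Lasso (HSIC Lasso).

    Solves  min_{a >= 0}  0.5 ||vec(L) - sum_k a_k vec(K_k)||^2 + lam * sum(a)
    where L and K_k are centered, Frobenius-normalized Gram matrices of the
    output and of each individual feature.
    """
    n, d = X.shape
    H = np.eye(n) - np.ones((n, n)) / n          # centering matrix
    L = H @ gaussian_gram(y, y.std() + 1e-12) @ H
    L /= np.linalg.norm(L)
    A = np.empty((n * n, d))
    for k in range(d):                           # one Gram matrix per feature
        Kk = H @ gaussian_gram(X[:, k], X[:, k].std() + 1e-12) @ H
        A[:, k] = (Kk / np.linalg.norm(Kk)).ravel()
    b = L.ravel()
    # ISTA for the non-negative Lasso: gradient step, then shrink and clip at 0.
    alpha = np.zeros(d)
    step = 1.0 / np.linalg.norm(A.T @ A, 2)      # 1 / Lipschitz constant
    for _ in range(n_iter):
        grad = A.T @ (A @ alpha - b)
        alpha = np.maximum(alpha - step * (grad + lam), 0.0)
    return alpha                                 # nonzero entries = selected features

# Toy usage: y depends nonlinearly on feature 0 only.
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 5))
y = np.sin(3 * X[:, 0]) + 0.1 * rng.normal(size=60)
weights = hsic_lasso(X, y)
```

Because each feature enters through its own Gram matrix, the dependence on the output is captured nonlinearly, while the problem stays a convex Lasso in the coefficients — which is what makes the globally optimal solution efficiently computable, as the abstract notes.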


Year:  2013        PMID: 24102126     DOI: 10.1162/NECO_a_00537

Source DB:  PubMed          Journal:  Neural Comput        ISSN: 0899-7667            Impact factor:   2.026


  26 in total

1.  Scalable Nonparametric Prescreening Method for Searching Higher-Order Genetic Interactions Underlying Quantitative Traits.

Authors:  Juho A J Kontio; Mikko J Sillanpää
Journal:  Genetics       Date:  2019-10-04       Impact factor: 4.562

2.  Estimating Linear and Nonlinear Gene Coexpression Networks by Semiparametric Neighborhood Selection.

Authors:  Juho A J Kontio; Marko J Rinta-Aho; Mikko J Sillanpää
Journal:  Genetics       Date:  2020-05-15       Impact factor: 4.562

3.  Computational approach for deriving cancer progression roadmaps from static sample data.

Authors:  Yijun Sun; Jin Yao; Le Yang; Runpu Chen; Norma J Nowak; Steve Goodison
Journal:  Nucleic Acids Res       Date:  2017-05-19       Impact factor: 16.971

4.  Parsimony in Protein Conformational Change.

Authors:  Brynmor K Chapman; Omar Davulcu; Jack J Skalicky; Rafael P Brüschweiler; Michael S Chapman
Journal:  Structure       Date:  2015-06-18       Impact factor: 5.006

5.  Deep learning-based identification of genetic variants: application to Alzheimer's disease classification.

Authors:  Taeho Jo; Kwangsik Nho; Paula Bice; Andrew J Saykin
Journal:  Brief Bioinform       Date:  2022-03-10       Impact factor: 11.622

6.  Feature selection for kernel methods in systems biology.

Authors:  Céline Brouard; Jérôme Mariette; Rémi Flamary; Nathalie Vialaneix
Journal:  NAR Genom Bioinform       Date:  2022-03-07

7.  LassoNet: Neural Networks with Feature Sparsity.

Authors:  Ismael Lemhadri; Feng Ruan; Robert Tibshirani
Journal:  Proc Mach Learn Res       Date:  2021-04

8.  Effective Cancer Subtype and Stage Prediction via Dropfeature-DNNs.

Authors:  Zhong Chen; Wensheng Zhang; Hongwen Deng; Kun Zhang
Journal:  IEEE/ACM Trans Comput Biol Bioinform       Date:  2022-02-03       Impact factor: 3.710

9.  Is demography destiny? Application of machine learning techniques to accurately predict population health outcomes from a minimal demographic dataset.

Authors:  Wei Luo; Thin Nguyen; Melanie Nichols; Truyen Tran; Santu Rana; Sunil Gupta; Dinh Phung; Svetha Venkatesh; Steve Allender
Journal:  PLoS One       Date:  2015-05-04       Impact factor: 3.240

10.  Texture Analysis of DCE-MRI Intratumoral Subregions to Identify Benign and Malignant Breast Tumors.

Authors:  Bin Zhang; Lirong Song; Jiandong Yin
Journal:  Front Oncol       Date:  2021-07-08       Impact factor: 6.244

