
The generalized LASSO.

Volker Roth.

Abstract

In the last few years, the support vector machine (SVM) method has motivated new interest in kernel regression techniques. Although the SVM has been shown to exhibit excellent generalization properties in many experiments, it suffers from several drawbacks, both of a theoretical and a technical nature: the absence of probabilistic outputs, the restriction to Mercer kernels, and the steep growth of the number of support vectors with increasing size of the training set. In this paper, we present a different class of kernel regressors that effectively overcome the above problems. We call this approach generalized LASSO regression. It has a clear probabilistic interpretation, can handle learning sets that are corrupted by outliers, produces extremely sparse solutions, and is capable of dealing with large-scale problems. For regression functionals which can be modeled as iteratively reweighted least-squares (IRLS) problems, we present a highly efficient algorithm with guaranteed global convergence. This defines a unified framework for sparse regression models in the very rich class of IRLS models, including various types of robust regression models and logistic regression. Performance studies for many standard benchmark datasets effectively demonstrate the advantages of this model over related approaches.
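The IRLS idea in the abstract can be illustrated on the plain (non-kernel) LASSO: the non-smooth penalty λ‖w‖₁ is majorized term-by-term by w_j²/(2|w_j_old|), so each iteration reduces to a weighted ridge solve. This is a minimal sketch under those assumptions, not the paper's kernelized algorithm; the function name, λ value, and toy data are illustrative only.

```python
import numpy as np

def irls_lasso(X, y, lam=1.0, n_iter=100, eps=1e-8):
    """Sketch: LASSO (0.5*||y - Xw||^2 + lam*||w||_1) via IRLS.

    Each |w_j| is majorized by w_j^2 / (2*|w_j_old|), so every
    iteration solves the weighted ridge system
        (X^T X + lam * diag(1/|w_old|)) w = X^T y.
    eps guards against division by zero as coefficients shrink.
    """
    d = X.shape[1]
    XtX, Xty = X.T @ X, X.T @ y
    # ridge warm start so the initial reweighting is well defined
    w = np.linalg.solve(XtX + lam * np.eye(d), Xty)
    for _ in range(n_iter):
        D = np.diag(1.0 / (np.abs(w) + eps))  # reweighting from current w
        w = np.linalg.solve(XtX + lam * D, Xty)
    return w

# toy data: only the first two of ten features carry signal
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 10))
w_true = np.zeros(10)
w_true[:2] = [3.0, -2.0]
y = X @ w_true + 0.01 * rng.standard_normal(100)
w_hat = irls_lasso(X, y, lam=1.0)
```

The reweighting drives irrelevant coefficients toward zero geometrically, which is the mechanism behind the "extremely sparse solutions" claimed in the abstract; the robust-regression and logistic variants mentioned there replace the squared-error term but keep the same reweighted solve.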


Year:  2004        PMID: 15387244     DOI: 10.1109/TNN.2003.809398

Source DB:  PubMed          Journal:  IEEE Trans Neural Netw        ISSN: 1045-9227


Similar articles: 17 in total

1.  A comprehensive descriptor of shape: method and application to content-based retrieval of similar appearing lesions in medical images.

Authors:  Jiajing Xu; Jessica Faruque; Christopher F Beaulieu; Daniel Rubin; Sandy Napel
Journal:  J Digit Imaging       Date:  2012-02       Impact factor: 4.056

2.  Deep sparse multi-task learning for feature selection in Alzheimer's disease diagnosis.

Authors:  Heung-Il Suk; Seong-Whan Lee; Dinggang Shen
Journal:  Brain Struct Funct       Date:  2015-05-21       Impact factor: 3.270

3.  Multi-Layer Multi-View Classification for Alzheimer's Disease Diagnosis.

Authors:  Changqing Zhang; Ehsan Adeli; Tao Zhou; Xiaobo Chen; Dinggang Shen
Journal:  Proc Conf AAAI Artif Intell       Date:  2018-02

4.  Feature Import Vector Machine: A General Classifier with Flexible Feature Selection.

Authors:  Samiran Ghosh; Yazhen Wang
Journal:  Stat Anal Data Min       Date:  2015-01-26       Impact factor: 1.051

5.  Local-learning-based feature selection for high-dimensional data analysis.

Authors:  Yijun Sun; Sinisa Todorovic; Steve Goodison
Journal:  IEEE Trans Pattern Anal Mach Intell       Date:  2010-09       Impact factor: 6.226

6.  Variable selection with Group LASSO approach: Application to Cox regression with frailty model.

Authors:  Jean Claude Utazirubanda; Tomas Leon; Papa Ngom
Journal:  Commun Stat Simul Comput       Date:  2018-02-28       Impact factor: 1.118

7.  Prediction using step-wise L1, L2 regularization and feature selection for small data sets with large number of features.

Authors:  Ozgur Demir-Kavuk; Mayumi Kamada; Tatsuya Akutsu; Ernst-Walter Knapp
Journal:  BMC Bioinformatics       Date:  2011-10-25       Impact factor: 3.169

8.  Subclass-based multi-task learning for Alzheimer's disease diagnosis.

Authors:  Heung-Il Suk; Seong-Whan Lee; Dinggang Shen
Journal:  Front Aging Neurosci       Date:  2014-08-07       Impact factor: 5.750

9.  A novel method incorporating gene ontology information for unsupervised clustering and feature selection.

Authors:  Shireesh Srivastava; Linxia Zhang; Rong Jin; Christina Chan
Journal:  PLoS One       Date:  2008-12-04       Impact factor: 3.240

10.  Diabetic retinopathy risk prediction for fundus examination using sparse learning: a cross-sectional study.

Authors:  Ein Oh; Tae Keun Yoo; Eun-Cheol Park
Journal:  BMC Med Inform Decis Mak       Date:  2013-09-13       Impact factor: 2.796

