
Efficient methods for overlapping group lasso.

Lei Yuan, Jun Liu, Jieping Ye.

Abstract

The group Lasso is an extension of the Lasso for feature selection on (predefined) nonoverlapping groups of features. The nonoverlapping group structure limits its applicability in practice. There have been several recent attempts to study a more general formulation in which groups of features are given, potentially with overlaps between the groups. The resulting optimization is, however, much more challenging to solve because of the group overlaps. In this paper, we consider the efficient optimization of the overlapping group Lasso penalized problem. We reveal several key properties of the proximal operator associated with the overlapping group Lasso, and compute the proximal operator by solving its smooth and convex dual problem, which allows the use of gradient-descent-type algorithms for the optimization. Our methods and theoretical results are then generalized to tackle the general overlapping group Lasso formulation based on the ℓq norm. We further extend our algorithm to solve a nonconvex overlapping group Lasso formulation based on the capped-norm regularization, which reduces the estimation bias introduced by the convex penalty. We have performed empirical evaluations using both a synthetic dataset and a breast cancer gene expression dataset, which consists of 8,141 genes organized into (overlapping) gene sets. Experimental results show that the proposed algorithm is more efficient than existing state-of-the-art algorithms. Results also demonstrate the effectiveness of the nonconvex formulation for overlapping group Lasso.
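The dual-based proximal computation described above can be sketched in a few lines. The prox problem is min_x (1/2)||x − v||² + λ Σ_g w_g ||x_g||₂, and its dual constrains each group's dual variable to an ℓ2-ball of radius λ·w_g; a minimal illustration follows, using block coordinate ascent on that dual (the function name, the block-coordinate solver, and the stopping tolerance are our own illustrative choices, not necessarily the paper's exact algorithm):

```python
import numpy as np

def prox_overlapping_group_lasso(v, groups, lam, weights, n_iter=200, tol=1e-10):
    """Prox of lam * sum_g weights[g] * ||x[groups[g]]||_2 at point v.

    Solves the smooth convex dual: each dual variable Y_g is constrained
    to the l2-ball of radius lam * weights[g], and the primal solution is
    x = v - sum_g Y_g (with Y_g scattered into the indices of group g).
    """
    v = np.asarray(v, dtype=float)
    Y = [np.zeros(len(g)) for g in groups]
    x = v.copy()  # invariant: x = v - sum of all current dual variables
    for _ in range(n_iter):
        max_change = 0.0
        for i, g in enumerate(groups):
            z = x[g] + Y[i]  # residual with group i's dual contribution removed
            norm = np.linalg.norm(z)
            radius = lam * weights[i]
            # project the residual onto the l2-ball of radius lam * weights[i]
            Y_new = z if norm <= radius else z * (radius / norm)
            max_change = max(max_change, float(np.max(np.abs(Y_new - Y[i]))))
            x[g] = z - Y_new  # restore the invariant with the updated dual
            Y[i] = Y_new
        if max_change < tol:
            break
    return x

# Sanity check with non-overlapping groups, where the prox reduces to
# ordinary group soft-thresholding: ||[3,4]|| = 5 shrinks by (5-1)/5,
# while ||[0.3,0.4]|| = 0.5 <= 1 is zeroed out entirely.
v = np.array([3.0, 4.0, 0.3, 0.4])
x = prox_overlapping_group_lasso(v, [[0, 1], [2, 3]], lam=1.0, weights=[1.0, 1.0])
print(x)  # -> [2.4 3.2 0.  0. ]
```

With overlapping groups (e.g. `[[0, 1], [1, 2]]`) the same loop applies; the sweep over groups simply no longer converges in a single pass. In the full regression problem, this prox would be called inside a proximal-gradient (e.g. FISTA-style) outer loop on the least-squares term.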

Year:  2013        PMID: 23868773     DOI: 10.1109/TPAMI.2013.17

Source DB:  PubMed          Journal:  IEEE Trans Pattern Anal Mach Intell        ISSN: 0162-8828            Impact factor:   6.226


Related articles: 5 in total

1.  Group spike-and-slab lasso generalized linear models for disease prediction and associated genes detection by incorporating pathway information.

Authors:  Zaixiang Tang; Yueping Shen; Yan Li; Xinyan Zhang; Jia Wen; Chen'ao Qian; Wenzhuo Zhuang; Xinghua Shi; Nengjun Yi
Journal:  Bioinformatics       Date:  2018-03-15       Impact factor: 6.937

2.  Fused Group Lasso Regularized Multi-Task Feature Learning and Its Application to the Cognitive Performance Prediction of Alzheimer's Disease.

Authors:  Xiaoli Liu; Peng Cao; Jianzhong Wang; Jun Kong; Dazhe Zhao
Journal:  Neuroinformatics       Date:  2019-04

3.  Group Guided Fused Laplacian Sparse Group Lasso for Modeling Alzheimer's Disease Progression.

Authors:  Xiaoli Liu; Jianzhong Wang; Fulong Ren; Jun Kong
Journal:  Comput Math Methods Med       Date:  2020-02-20       Impact factor: 2.238

4.  Leveraging pleiotropic association using sparse group variable selection in genomics data.

Authors:  Matthew Sutton; Pierre-Emmanuel Sugier; Therese Truong; Benoit Liquet
Journal:  BMC Med Res Methodol       Date:  2022-01-07       Impact factor: 4.615

5.  Multi-task learning sparse group lasso: a method for quantifying antigenicity of influenza A(H1N1) virus using mutations and variations in glycosylation of Hemagglutinin.

Authors:  Lei Li; Deborah Chang; Lei Han; Xiaojian Zhang; Joseph Zaia; Xiu-Feng Wan
Journal:  BMC Bioinformatics       Date:  2020-05-11       Impact factor: 3.169


Beijing Coyote Bioscience Co., Ltd. (北京卡尤迪生物科技股份有限公司) © 2022-2023.