Abstract
Several sparseness penalties have been suggested to deliver good predictive performance in automatic variable selection within the regularization framework. All assume that the true model is sparse. We propose a penalty, a convex combination of the L1- and L∞-norms, that adapts to a variety of situations, including sparseness and nonsparseness, grouping and nongrouping. The proposed penalty performs grouping and adaptive regularization. In addition, we introduce a novel homotopy algorithm utilizing subgradients to develop regularization solution surfaces involving multiple regularizers, permitting efficient computation and adaptive tuning. In numerical experiments with simulated and real data, the proposed penalty compares well against popular alternatives.
Keywords: Homotopy; L1-norm; Lasso; L∞-norm; Subgradient; Support vector machine; Variable grouping and selection
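The penalty described in the abstract can be illustrated with a minimal sketch. The function name `l1_linf_penalty` and the parameterization below (a mixing weight `alpha` and overall strength `lam`) are illustrative assumptions, not the paper's exact notation; the sketch only shows the convex combination of the L1- and L∞-norms that interpolates between the lasso-style penalty and a grouping-inducing penalty.

```python
import numpy as np

def l1_linf_penalty(beta, alpha, lam=1.0):
    """Convex combination of the L1- and L-infinity norms:
        lam * (alpha * ||beta||_1 + (1 - alpha) * ||beta||_inf).
    alpha in [0, 1] interpolates between a pure L1 (lasso) penalty
    at alpha = 1 and a pure L-infinity penalty at alpha = 0.
    Names and parameterization are illustrative, not the paper's."""
    beta = np.asarray(beta, dtype=float)
    l1 = np.abs(beta).sum()     # promotes sparseness
    linf = np.abs(beta).max()   # promotes grouping at a common magnitude
    return lam * (alpha * l1 + (1.0 - alpha) * linf)

# For beta = (3, -1, 2): ||beta||_1 = 6, ||beta||_inf = 3.
print(l1_linf_penalty([3.0, -1.0, 2.0], alpha=1.0))  # 6.0 (lasso)
print(l1_linf_penalty([3.0, -1.0, 2.0], alpha=0.0))  # 3.0 (L-infinity)
print(l1_linf_penalty([3.0, -1.0, 2.0], alpha=0.5))  # 4.5 (mixture)
```

Varying `alpha` is what lets the penalty adapt between sparse and nonsparse, grouped and nongrouped settings, as the abstract describes.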
Year: 2009 PMID: 23946545 PMCID: PMC3741328 DOI: 10.1093/biomet/asp038
Source DB: PubMed Journal: Biometrika ISSN: 0006-3444 Impact factor: 2.445