Abstract
Recent studies have demonstrated the theoretical attractiveness of a class of concave penalties in variable selection, including the smoothly clipped absolute deviation and minimax concave penalties. The computation of the concave penalized solutions in high-dimensional models, however, is a difficult task. We propose a majorization minimization by coordinate descent (MMCD) algorithm for computing the concave penalized solutions in generalized linear models. In contrast to the existing algorithms that use local quadratic or local linear approximation to the penalty function, the MMCD seeks to majorize the negative log-likelihood by a quadratic loss, but does not use any approximation to the penalty. This strategy makes it possible to avoid the computation of a scaling factor in each update of the solutions, which improves the efficiency of coordinate descent. Under certain regularity conditions, we establish the theoretical convergence properties of the MMCD. We implement this algorithm for a penalized logistic regression model using the SCAD and MCP penalties. Simulation studies and a data example demonstrate that the MMCD is sufficiently fast for penalized logistic regression in high-dimensional settings where the number of covariates is much larger than the sample size.
Keywords: logistic regression; minimax concave penalty; p ≫ n models; smoothly clipped absolute deviation penalty; variable selection
Year: 2014 PMID: 25309048 PMCID: PMC4191872 DOI: 10.1007/s11222-013-9407-3
Source DB: PubMed Journal: Stat Comput ISSN: 0960-3174 Impact factor: 2.559
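The idea described in the abstract can be sketched in code. For the logistic model, the second derivative of the negative log-likelihood is bounded by 1/4, so the loss can be majorized by a quadratic with fixed curvature v = 1/4 per (standardized) coordinate; each coordinate update then solves a one-dimensional quadratic-plus-MCP problem exactly, with no approximation to the penalty. The sketch below is an illustration under stated assumptions, not the authors' implementation: it covers only the MCP penalty, assumes columns of X are standardized so that ||x_j||^2 / n = 1, and the function names (`mcp_threshold`, `mmcd_logistic_mcp`) and defaults (gamma = 8, which satisfies the gamma * v > 1 condition needed for the univariate problem to be convex) are hypothetical choices for this example.

```python
import numpy as np

def mcp_threshold(z, v, lam, gam):
    # Exact minimizer of (v/2)*(b - z)^2 + MCP(|b|; lam, gam),
    # where MCP(t) = lam*t - t^2/(2*gam) for t <= gam*lam, else gam*lam^2/2.
    # Requires gam * v > 1 so the univariate objective is convex.
    if abs(z) <= gam * lam:
        # soft-threshold v*z at lam, then rescale by the adjusted curvature
        return np.sign(z) * max(v * abs(z) - lam, 0.0) / (v - 1.0 / gam)
    return z  # beyond gam*lam the MCP is flat: unpenalized solution

def mmcd_logistic_mcp(X, y, lam, gam=8.0, max_iter=500, tol=1e-7):
    # Illustrative MMCD-style coordinate descent for MCP-penalized
    # logistic regression. Assumes ||x_j||^2 / n = 1 for every column;
    # the intercept is left unpenalized. Names and defaults are
    # assumptions for this sketch.
    n, p = X.shape
    v = 0.25  # majorization bound: p*(1 - p) <= 1/4 for the logistic model
    beta0, beta = 0.0, np.zeros(p)
    eta = np.zeros(n)  # linear predictor, kept in sync with beta
    for _ in range(max_iter):
        max_change = 0.0
        # unpenalized intercept update from the quadratic surrogate
        prob = 1.0 / (1.0 + np.exp(-eta))
        d0 = np.mean(y - prob) / v
        beta0 += d0
        eta += d0
        max_change = max(max_change, abs(d0))
        for j in range(p):
            prob = 1.0 / (1.0 + np.exp(-eta))
            # complete the square: z_j = beta_j + gradient step with curvature v
            zj = beta[j] + X[:, j] @ (y - prob) / (n * v)
            bj = mcp_threshold(zj, v, lam, gam)
            d = bj - beta[j]
            if d != 0.0:
                beta[j] = bj
                eta += d * X[:, j]
                max_change = max(max_change, abs(d))
        if max_change < tol:
            break
    return beta0, beta
```

Because the curvature v is a fixed constant rather than a data-dependent weight, no per-coordinate scaling factor has to be recomputed at each update, which is the efficiency gain the abstract attributes to MMCD over local quadratic or local linear approximation schemes.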