One-step sparse estimates in nonconcave penalized likelihood models
Hui Zou, Runze Li
Ann. Statist. 36(4): 1509-1533 (August 2008). DOI: 10.1214/009053607000000802

Abstract

Fan and Li propose a family of variable selection methods via penalized likelihood using concave penalty functions. The nonconcave penalized likelihood estimators enjoy the oracle properties, but maximizing the penalized likelihood function is computationally challenging because the objective function is nondifferentiable and nonconcave. In this article, we propose a new unified algorithm based on the local linear approximation (LLA) for maximizing the penalized likelihood for a broad class of concave penalty functions. Convergence and other theoretical properties of the LLA algorithm are established. A distinguishing feature of the LLA algorithm is that at each step the LLA estimator naturally admits a sparse representation. We therefore suggest using the one-step LLA estimator from the LLA algorithm as the final estimate. Statistically, we show that if the regularization parameter is appropriately chosen and the initial estimator is good, the one-step LLA estimates enjoy the oracle properties. Computationally, one-step LLA estimation dramatically reduces the cost of maximizing the nonconcave penalized likelihood. We conduct Monte Carlo simulations to assess the finite-sample performance of the one-step sparse estimation methods; the results are very encouraging.
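To make the idea of the abstract concrete in the simplest setting, a one-step LLA estimate for penalized least squares can be sketched as follows: linearize the SCAD penalty at an initial estimate, which turns the nonconcave problem into a weighted lasso. This is a minimal illustrative sketch, not the paper's implementation; the OLS initial estimator, the cyclic coordinate-descent solver, and the names `one_step_lla` and `scad_deriv` are all assumptions made here for illustration.

```python
import numpy as np

def scad_deriv(beta, lam, a=3.7):
    """Derivative p'_lam(|b|) of the SCAD penalty (Fan and Li, 2001):
    lam for |b| <= lam; (a*lam - |b|)/(a-1) for lam < |b| <= a*lam; 0 beyond."""
    b = np.abs(beta)
    return lam * (b <= lam) + np.maximum(a * lam - b, 0.0) / (a - 1.0) * (b > lam)

def one_step_lla(X, y, lam, n_sweeps=200):
    """One-step LLA sketch: linearize SCAD at an initial estimate (OLS here,
    an assumption of this sketch), then solve the resulting weighted lasso
    (1/(2n))||y - X b||^2 + sum_j w_j |b_j| by cyclic coordinate descent."""
    n, p = X.shape
    beta0 = np.linalg.lstsq(X, y, rcond=None)[0]     # initial estimator
    w = scad_deriv(beta0, lam)                       # LLA weights; w_j = 0 => unpenalized
    beta = beta0.copy()
    col_norm2 = (X ** 2).sum(axis=0)
    for _ in range(n_sweeps):
        for j in range(p):
            r_j = y - X @ beta + X[:, j] * beta[j]   # partial residual excluding j
            z = X[:, j] @ r_j
            # soft-thresholding update for the weighted lasso
            beta[j] = np.sign(z) * max(abs(z) - n * w[j], 0.0) / col_norm2[j]
    return beta
```

Because the SCAD derivative vanishes for large coefficients, components with a large initial estimate end up unpenalized, while small components face a full lasso-type penalty and are thresholded to exactly zero, which is how the one-step estimator acquires its sparse representation.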

Citation


Hui Zou, Runze Li. "One-step sparse estimates in nonconcave penalized likelihood models." Ann. Statist. 36 (4) 1509-1533, August 2008. https://doi.org/10.1214/009053607000000802

Information

Published: August 2008
First available in Project Euclid: 16 July 2008

zbMATH: 1142.62027
MathSciNet: MR2435443
Digital Object Identifier: 10.1214/009053607000000802

Subjects:
Primary: 62J05, 62J07

Rights: Copyright © 2008 Institute of Mathematical Statistics

JOURNAL ARTICLE
25 PAGES

