The Annals of Statistics

On the adaptive elastic-net with a diverging number of parameters

Hui Zou and Hao Helen Zhang

Abstract

We consider the problem of model selection and estimation in situations where the number of parameters diverges with the sample size. When the dimension is high, an ideal method should have the oracle property [J. Amer. Statist. Assoc. 96 (2001) 1348–1360; Ann. Statist. 32 (2004) 928–961], which ensures optimal large-sample performance. Furthermore, high dimensionality often induces collinearity, which an ideal method should handle properly. Many existing variable selection methods fail to achieve both goals simultaneously. In this paper, we propose the adaptive elastic-net, which combines the strengths of quadratic regularization and adaptively weighted lasso shrinkage. Under weak regularity conditions, we establish the oracle property of the adaptive elastic-net. We show by simulations that the adaptive elastic-net deals with the collinearity problem better than other oracle-like methods, thus enjoying much improved finite-sample performance.
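As a rough illustration (not the authors' code), the two-stage estimator sketched in the abstract — an initial elastic-net fit used to build adaptive weights w_j = |beta_hat_j|^(-gamma), followed by a ridge-plus-weighted-lasso fit with the (1 + lambda2/n) rescaling from the adaptive elastic-net definition — can be implemented with a simple coordinate descent. The penalty levels, gamma, and simulated data below are illustrative choices, not values from the paper.

```python
import numpy as np

def soft_threshold(z, t):
    # S(z, t) = sign(z) * max(|z| - t, 0)
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def weighted_enet(X, y, lam1, lam2, w, n_iter=300):
    # Coordinate descent for:
    #   argmin_b ||y - X b||^2 + lam2 * ||b||^2 + lam1 * sum_j w_j |b_j|
    n, p = X.shape
    beta = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0)
    r = y.copy()                               # residual y - X beta (beta = 0)
    for _ in range(n_iter):
        for j in range(p):
            r_j = r + X[:, j] * beta[j]        # residual with feature j removed
            zj = X[:, j] @ r_j
            beta[j] = soft_threshold(zj, lam1 * w[j] / 2.0) / (col_sq[j] + lam2)
            r = r_j - X[:, j] * beta[j]
    return beta

def adaptive_enet(X, y, lam1, lam2, gamma=1.0, eps=1e-6):
    n, p = X.shape
    # Stage 1: ordinary elastic-net fit (unit weights) to build adaptive weights
    b_enet = weighted_enet(X, y, lam1, lam2, np.ones(p))
    w = (np.abs(b_enet) + eps) ** (-gamma)
    # Stage 2: weighted lasso + ridge, then the (1 + lam2/n) rescaling
    return (1.0 + lam2 / n) * weighted_enet(X, y, lam1, lam2, w)

# Illustrative simulation (sparse truth; values are not from the paper)
rng = np.random.default_rng(0)
n, p = 100, 10
beta_true = np.array([3.0, 1.5, 0, 0, 2.0, 0, 0, 0, 0, 0])
X = rng.standard_normal((n, p))
y = X @ beta_true + 0.1 * rng.standard_normal(n)
beta_hat = adaptive_enet(X, y, lam1=1.0, lam2=1.0)
```

The adaptive weights make the l1 penalty heavy on coefficients the first stage estimated near zero and light on large ones, which is what drives the selection consistency part of the oracle property, while the ridge term stabilizes the fit under collinearity.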

Article information

Source
Ann. Statist., Volume 37, Number 4 (2009), 1733-1751.

Dates
First available in Project Euclid: 18 June 2009

Permanent link to this document
https://projecteuclid.org/euclid.aos/1245332831

Digital Object Identifier
doi:10.1214/08-AOS625

Mathematical Reviews number (MathSciNet)
MR2533470

Zentralblatt MATH identifier
1168.62064

Subjects
Primary: 62J05: Linear regression
Secondary: 62J07: Ridge regression; shrinkage estimators

Keywords
Adaptive regularization; elastic-net; high dimensionality; model selection; oracle property; shrinkage methods

Citation

Zou, Hui; Zhang, Hao Helen. On the adaptive elastic-net with a diverging number of parameters. Ann. Statist. 37 (2009), no. 4, 1733–1751. doi:10.1214/08-AOS625. https://projecteuclid.org/euclid.aos/1245332831

