August 2011 The sparse Laplacian shrinkage estimator for high-dimensional regression
Jian Huang, Shuangge Ma, Hongzhe Li, Cun-Hui Zhang
Ann. Statist. 39(4): 2021-2046 (August 2011). DOI: 10.1214/11-AOS897

Abstract

We propose a new penalized method for variable selection and estimation that explicitly incorporates the correlation patterns among predictors. This method is based on a combination of the minimax concave penalty and the Laplacian quadratic associated with a graph as the penalty function. We call it the sparse Laplacian shrinkage (SLS) method. The SLS uses the minimax concave penalty to encourage sparsity and the Laplacian quadratic penalty to promote smoothness among coefficients associated with correlated predictors. The SLS has a generalized grouping property with respect to the graph represented by the Laplacian quadratic. We show that the SLS possesses an oracle property in the sense that it is selection consistent and equal to the oracle Laplacian shrinkage estimator with high probability. This result holds in sparse, high-dimensional settings with p ≫ n under reasonable conditions. We derive a coordinate descent algorithm for computing the SLS estimates. Simulation studies are conducted to evaluate the performance of the SLS method and a real data example is used to illustrate its application.
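As a rough illustration of the kind of estimator the abstract describes, the following is a minimal NumPy sketch of coordinate descent for an objective combining a least-squares loss, a minimax concave penalty (MCP), and a graph-Laplacian quadratic penalty. All function names, parameter choices, and the update details are this sketch's own assumptions, not the authors' implementation; it assumes standardized columns with ||x_j||^2 / n = 1.

```python
import numpy as np

def mcp_threshold(z, v, lam, gamma):
    """One-dimensional minimizer of (v/2)(b - z)^2 + MCP(|b|; lam, gamma),
    where MCP(t) = lam*t - t^2/(2*gamma) for t <= gamma*lam, constant beyond.
    Requires gamma * v > 1."""
    if abs(z) <= gamma * lam:
        s = np.sign(z) * max(abs(v * z) - lam, 0.0)  # soft-thresholding step
        return s / (v - 1.0 / gamma)
    return z  # outside the MCP region the penalty is flat: no shrinkage

def sls_coordinate_descent(X, y, L, lam1, lam2, gamma=3.0, n_iter=100):
    """Toy coordinate descent for an SLS-type objective
        (1/2n)||y - X b||^2 + sum_j MCP(|b_j|; lam1, gamma) + (lam2/2) b' L b,
    with L a graph Laplacian. Assumes columns of X satisfy ||x_j||^2 / n = 1."""
    n, p = X.shape
    beta = np.zeros(p)
    r = y - X @ beta          # full residual, kept up to date
    d = np.diag(L)            # node degrees (Laplacian diagonal)
    A = np.diag(d) - L        # edge-weight (adjacency) matrix, zero diagonal
    for _ in range(n_iter):
        for j in range(p):
            v = 1.0 + lam2 * d[j]                 # curvature of the 1-D problem
            # unpenalized 1-D minimizer: gradient from the partial residual
            # plus the Laplacian coupling to the neighbors of j
            z = (X[:, j] @ r / n + beta[j] + lam2 * (A[j] @ beta)) / v
            b_new = mcp_threshold(z, v, lam1, gamma)
            if b_new != beta[j]:
                r += X[:, j] * (beta[j] - b_new)  # update residual in place
                beta[j] = b_new
    return beta
```

With lam2 = 0 (an empty graph, L = 0) the procedure reduces to plain MCP-penalized regression; a positive lam2 pulls coefficients of graph-adjacent predictors toward each other, which is the smoothing effect the Laplacian quadratic is meant to provide.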

Citation


Jian Huang, Shuangge Ma, Hongzhe Li, Cun-Hui Zhang. "The sparse Laplacian shrinkage estimator for high-dimensional regression." Ann. Statist. 39 (4) 2021–2046, August 2011. https://doi.org/10.1214/11-AOS897

Information

Published: August 2011
First available in Project Euclid: 24 August 2011

zbMATH: 1227.62049
MathSciNet: MR2893860
Digital Object Identifier: 10.1214/11-AOS897

Subjects:
Primary: 62J05, 62J07
Secondary: 60F12, 62H20

Rights: Copyright © 2011 Institute of Mathematical Statistics

Journal article, 26 pages

