Open Access
Prediction weighted maximum frequency selection
Hongmei Liu, J. Sunil Rao
Electron. J. Statist. 11(1): 640-681 (2017). DOI: 10.1214/17-EJS1240

Abstract

Shrinkage estimators that can produce sparse solutions have become increasingly important to the analysis of today’s complex datasets. Examples include the LASSO, the Elastic-Net and their adaptive counterparts. Estimation of penalty parameters, however, still presents difficulties, and while variable selection consistent procedures have been developed, their finite sample performance can often be less than satisfactory. We develop a new strategy for variable selection using the adaptive LASSO and adaptive Elastic-Net estimators with $p_{n}$ diverging. The basic idea is first to use the trace paths of their LARS solutions to bootstrap estimates of maximum frequency (MF) models conditioned on dimension. Conditioning on dimension effectively mitigates overfitting; to deal with underfitting, these MFs are then prediction-weighted. It is shown that not only can consistent model selection be achieved, but attractive convergence rates can as well, leading to excellent finite sample performance. Detailed numerical studies are carried out on both simulated and real datasets. Extensions to the class of generalized linear models are also detailed.
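The procedure described in the abstract can be illustrated with a small sketch: bootstrap the LARS/lasso solution paths, tally the selected model at each dimension, keep the maximum frequency (MF) model per dimension, and then choose among the MF models by prediction performance. This is not the authors' implementation; the data, the bootstrap size, and the hold-out mean squared error used as the prediction weight are all illustrative stand-ins for the paper's scheme.

```python
import numpy as np
from collections import Counter
from sklearn.linear_model import lars_path

# Toy sparse regression: only the first three predictors are active.
rng = np.random.default_rng(0)
n, p = 200, 8
X = rng.standard_normal((n, p))
beta = np.array([3.0, -3.0, 3.0, 0, 0, 0, 0, 0])
y = X @ beta + rng.standard_normal(n)

# One fixed split for the prediction weighting (a crude stand-in
# for the paper's weighting scheme).
half = n // 2
Xtr, ytr, Xte, yte = X[:half], y[:half], X[half:], y[half:]

B = 50  # bootstrap replicates
counts = {k: Counter() for k in range(1, p + 1)}  # models tallied by dimension
for _ in range(B):
    idx = rng.integers(0, half, half)
    # Full lasso path via LARS; coefs has shape (p, n_steps).
    _, _, coefs = lars_path(Xtr[idx], ytr[idx], method="lasso")
    for step in range(coefs.shape[1]):
        model = tuple(np.flatnonzero(coefs[:, step]))
        if model:
            counts[len(model)][model] += 1

# Maximum frequency (MF) model at each dimension.
mf = {k: c.most_common(1)[0][0] for k, c in counts.items() if c}

def holdout_mse(model):
    """OLS fit on the training half, MSE on the held-out half."""
    cols = list(model)
    coef, *_ = np.linalg.lstsq(Xtr[:, cols], ytr, rcond=None)
    resid = yte - Xte[:, cols] @ coef
    return float(np.mean(resid ** 2))

# Prediction-weight the MF models: pick the one with smallest hold-out error.
best = min(mf.values(), key=holdout_mse)
print(sorted(best))
```

Conditioning on dimension means overfit and underfit candidates compete only within their own model size; the prediction weighting then arbitrates across sizes, which is why the two steps together address both failure modes.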

Citation


Hongmei Liu, J. Sunil Rao. "Prediction weighted maximum frequency selection." Electron. J. Statist. 11(1): 640-681, 2017. https://doi.org/10.1214/17-EJS1240

Information

Received: 1 March 2016; Published: 2017
First available in Project Euclid: 3 March 2017

zbMATH: 1359.62298
MathSciNet: MR3619319
Digital Object Identifier: 10.1214/17-EJS1240

Subjects:
Primary: 62J07

Keywords: adaptive Elastic-Net, adaptive LASSO, bootstrapping, model selection
