Abstract
Coefficient estimation and variable selection in multiple linear regression are routinely done in the (penalized) least squares (LS) framework. The concept of the model selection oracle introduced by Fan and Li [J. Amer. Statist. Assoc. 96 (2001) 1348–1360] characterizes the optimal behavior of a model selection procedure. However, the least-squares oracle theory breaks down if the error variance is infinite. In this paper we propose a new regression method called composite quantile regression (CQR). We show that the oracle model selection theory using the CQR oracle works beautifully even when the error variance is infinite. We develop a new oracular procedure to achieve the optimal properties of the CQR oracle. When the error variance is finite, CQR still enjoys great advantages in terms of estimation efficiency. We show that the relative efficiency of CQR compared with least squares is greater than 70% regardless of the error distribution. Moreover, CQR can be much more efficient, and sometimes arbitrarily more efficient, than least squares. The same conclusions hold when comparing a CQR-oracular estimator with an LS-oracular estimator.
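For context, a minimal sketch of the composite quantile loss underlying CQR, assuming the usual check-loss formulation with K equally spaced quantile levels (the notation below is not spelled out in the abstract itself):

\[
(\hat{b}_1,\dots,\hat{b}_K,\hat{\beta}) \;=\; \arg\min_{b_1,\dots,b_K,\,\beta}\; \sum_{k=1}^{K}\sum_{i=1}^{n} \rho_{\tau_k}\!\bigl(y_i - b_k - x_i^{T}\beta\bigr),
\qquad
\rho_{\tau}(u) = u\bigl(\tau - \mathbf{1}\{u<0\}\bigr),\quad \tau_k = \frac{k}{K+1}.
\]

In this sketch the K intercepts b_k accommodate the different quantile levels, while a single slope vector β is shared across all quantiles, which is what drives the efficiency gains over estimating each quantile regression separately.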
Citation
Hui Zou and Ming Yuan. "Composite quantile regression and the oracle model selection theory." Ann. Statist. 36 (3) 1108–1126, June 2008. https://doi.org/10.1214/07-AOS507