The Annals of Statistics

Composite quantile regression and the oracle model selection theory

Hui Zou and Ming Yuan



Coefficient estimation and variable selection in multiple linear regression is routinely done in the (penalized) least squares (LS) framework. The concept of model selection oracle introduced by Fan and Li [J. Amer. Statist. Assoc. 96 (2001) 1348–1360] characterizes the optimal behavior of a model selection procedure. However, the least-squares oracle theory breaks down if the error variance is infinite. In the current paper we propose a new regression method called composite quantile regression (CQR). We show that the oracle model selection theory using the CQR oracle works beautifully even when the error variance is infinite. We develop a new oracular procedure to achieve the optimal properties of the CQR oracle. When the error variance is finite, CQR still enjoys great advantages in terms of estimation efficiency. We show that the relative efficiency of CQR compared to the least squares is greater than 70% regardless of the error distribution. Moreover, CQR could be much more efficient and sometimes arbitrarily more efficient than the least squares. The same conclusions hold when comparing a CQR-oracular estimator with an LS-oracular estimator.
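The CQR estimator combines the quantile regression check losses at several quantile levels, forcing the levels to share one slope vector while each level keeps its own intercept. As a minimal illustrative sketch (not the authors' implementation), the composite check-loss minimization for a single covariate can be cast as a standard linear program; the helper name `cqr_fit` and all choices below (K = 5 equally spaced levels, the LP formulation via positive/negative residual parts) are assumptions for illustration.

```python
import numpy as np
from scipy.optimize import linprog

def cqr_fit(x, y, K=5):
    """Composite quantile regression for one covariate, solved as an LP.

    Minimizes sum_k sum_i rho_{tau_k}(y_i - b_k - beta * x_i) with
    tau_k = k / (K + 1): one slope beta shared across all K quantile
    levels, one intercept b_k per level.  rho_tau is the check loss
    rho_tau(r) = tau * max(r, 0) + (1 - tau) * max(-r, 0).
    """
    n = len(y)
    taus = np.arange(1, K + 1) / (K + 1)
    nuv = n * K  # number of (i, k) residual pairs
    # Decision variables: [beta, b_1..b_K, u_11..u_nK, v_11..v_nK],
    # where u_ik, v_ik >= 0 are the positive/negative parts of the
    # residual y_i - b_k - beta * x_i at level tau_k.
    c = np.concatenate([
        [0.0], np.zeros(K),
        np.repeat(taus, n),        # weight tau_k on u_ik
        np.repeat(1.0 - taus, n),  # weight 1 - tau_k on v_ik
    ])
    # Equality constraints: beta*x_i + b_k + u_ik - v_ik = y_i.
    A_eq = np.zeros((nuv, 1 + K + 2 * nuv))
    b_eq = np.zeros(nuv)
    for k in range(K):
        for i in range(n):
            r = k * n + i
            A_eq[r, 0] = x[i]                # beta term
            A_eq[r, 1 + k] = 1.0             # intercept b_k
            A_eq[r, 1 + K + r] = 1.0         # + u_ik
            A_eq[r, 1 + K + nuv + r] = -1.0  # - v_ik
            b_eq[r] = y[i]
    bounds = [(None, None)] * (1 + K) + [(0, None)] * (2 * nuv)
    res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=bounds, method="highs")
    return res.x[0], res.x[1:1 + K]  # shared slope, per-level intercepts
```

On simulated data such as `y = 1 + 2*x + noise`, the shared slope returned by this sketch estimates the common regression coefficient, while the K intercepts trace out the K conditional quantiles of the error distribution.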

Article information

Ann. Statist., Volume 36, Number 3 (2008), 1108-1126.

First available in Project Euclid: 26 May 2008


Primary: 62J05: Linear regression
Secondary: 62J07: Ridge regression; shrinkage estimators

Keywords: asymptotic efficiency; linear program; model selection; oracle properties; universal lower bound


Zou, Hui; Yuan, Ming. Composite quantile regression and the oracle model selection theory. Ann. Statist. 36 (2008), no. 3, 1108--1126. doi:10.1214/07-AOS507.



  • Breiman, L. (1995). Better subset regression using the nonnegative garrote. Technometrics 37 373–384.
  • Fan, J. and Li, R. (2001). Variable selection via nonconcave penalized likelihood and its oracle properties. J. Amer. Statist. Assoc. 96 1348–1360.
  • Fan, J. and Li, R. (2006). Statistical challenges with high dimensionality: Feature selection in knowledge discovery. Proceedings of the Madrid International Congress of Mathematicians 2006 III 595–622. EMS, Zurich.
  • Feller, W. (1968). An Introduction to Probability Theory and Its Applications 1, 3rd ed. Wiley, New York.
  • Knight, K. (1998). Limiting distributions for l1 regression estimators under general conditions. Ann. Statist. 26 755–770.
  • Koenker, R. (2005). Quantile Regression. Cambridge Univ. Press.
  • Koenker, R. and Geling, O. (2001). Reappraising medfly longevity: A quantile regression survival analysis. J. Amer. Statist. Assoc. 96 458–468.
  • Koenker, R. and Hallock, K. (2001). Quantile regression. J. Economic Perspectives 15 143–156.
  • Tibshirani, R. (1996). Regression shrinkage and selection via the lasso. J. Roy. Statist. Soc. Ser. B 58 267–288.
  • Zou, H. (2006). The adaptive lasso and its oracle properties. J. Amer. Statist. Assoc. 101 1418–1429.
  • Zou, H. and Yuan, M. (2007). Composite quantile regression and the oracle model selection theory. Technical report, Univ. Minnesota.