The Annals of Statistics

High-dimensional generalizations of asymmetric least squares regression and their applications

Yuwen Gu and Hui Zou


Abstract

Asymmetric least squares regression is an important method with wide applications in statistics, econometrics and finance. Existing work on asymmetric least squares considers only the traditional low-dimension, large-sample setting. In this paper, we systematically study the Sparse Asymmetric LEast Squares (SALES) regression under high dimensions, where the penalty functions include the Lasso and nonconvex penalties. We develop a unified efficient algorithm for fitting SALES and establish its theoretical properties. As an important application, SALES is used to detect heteroscedasticity in high-dimensional data. Another method for detecting heteroscedasticity is the sparse quantile regression. However, both SALES and the sparse quantile regression may fail to tell which variables are important for the conditional mean and which are important for the conditional scale/variance, especially when some variables are important for both. To that end, we further propose a COupled Sparse Asymmetric LEast Squares (COSALES) regression, which can be efficiently solved by an algorithm similar to that for SALES. We establish theoretical properties of COSALES. In particular, COSALES using the SCAD penalty or MCP is shown to consistently identify the two important subsets for the mean and scale simultaneously, even when the two subsets overlap. We demonstrate the empirical performance of SALES and COSALES on simulated and real data.
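To make the setup concrete, the following minimal sketch illustrates the kind of estimator the abstract describes: asymmetric (expectile) squared loss with an l1 penalty, fit by proximal gradient descent with soft-thresholding. This is an illustrative implementation under stated assumptions, not the authors' algorithm (their unified algorithm and the nonconvex-penalty variants are developed in the paper itself); the function names and the choice of proximal gradient are ours.

```python
import numpy as np

def expectile_loss(r, tau):
    """Asymmetric squared loss: mean of |tau - I(r < 0)| * r^2."""
    w = np.where(r < 0, 1 - tau, tau)
    return np.mean(w * r ** 2)

def soft_threshold(z, t):
    """Proximal operator of t * ||.||_1 (elementwise soft-thresholding)."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def sales_lasso(X, y, tau=0.5, lam=0.1, n_iter=500, step=None):
    """Lasso-penalized asymmetric least squares via proximal gradient
    (an illustrative sketch, not the paper's unified algorithm)."""
    n, p = X.shape
    if step is None:
        # The smooth part has gradient Lipschitz constant
        # 2 * max(tau, 1 - tau) * ||X||_2^2 / n.
        L = 2 * max(tau, 1 - tau) * np.linalg.norm(X, 2) ** 2 / n
        step = 1.0 / L
    beta = np.zeros(p)
    for _ in range(n_iter):
        r = y - X @ beta
        w = np.where(r < 0, 1 - tau, tau)       # asymmetric weights
        grad = -2.0 / n * X.T @ (w * r)         # gradient of the smooth loss
        beta = soft_threshold(beta - step * grad, step * lam)
    return beta
```

For tau = 0.5 the loss reduces to ordinary least squares (up to a factor of 1/2), so the estimator interpolates between mean regression and expectile regression as tau varies; choosing tau away from 0.5 is what allows the method to pick up scale (heteroscedasticity) effects, as the abstract discusses.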

Article information

Source
Ann. Statist. Volume 44, Number 6 (2016), 2661-2694.

Dates
Received: June 2015
Revised: November 2015
First available in Project Euclid: 23 November 2016

Permanent link to this document
https://projecteuclid.org/euclid.aos/1479891631

Digital Object Identifier
doi:10.1214/15-AOS1431

Mathematical Reviews number (MathSciNet)
MR3576557

Zentralblatt MATH identifier
1364.62185

Subjects
Primary: 62J07: Ridge regression; shrinkage estimators

Keywords
Asymmetric least squares; COSALES; high dimensions; SALES

Citation

Gu, Yuwen; Zou, Hui. High-dimensional generalizations of asymmetric least squares regression and their applications. Ann. Statist. 44 (2016), no. 6, 2661--2694. doi:10.1214/15-AOS1431. https://projecteuclid.org/euclid.aos/1479891631

References

  • Bickel, P. J., Ritov, Y. and Tsybakov, A. B. (2009). Simultaneous analysis of lasso and Dantzig selector. Ann. Statist. 37 1705–1732.
  • Candes, E. and Tao, T. (2007). The Dantzig selector: Statistical estimation when $p$ is much larger than $n$. Ann. Statist. 35 2313–2351.
  • Chambers, R. and Tzavidis, N. (2006). $M$-quantile models for small area estimation. Biometrika 93 255–268.
  • Daye, Z. J., Chen, J. and Li, H. (2012). High-dimensional heteroscedastic regression with an application to eQTL data analysis. Biometrics 68 316–326.
  • Efron, B. (1991). Regression percentiles using asymmetric squared error loss. Statist. Sinica 1 93–125.
  • Efron, B., Hastie, T., Johnstone, I. and Tibshirani, R. (2004). Least angle regression. Ann. Statist. 32 407–499.
  • Eilers, P. H. and Boelens, H. F. (2005). Baseline correction with asymmetric least squares smoothing. Leiden Univ. Medical Centre Report.
  • Fan, J. and Li, R. (2001). Variable selection via nonconcave penalized likelihood and its oracle properties. J. Amer. Statist. Assoc. 96 1348–1360.
  • Fan, J. and Lv, J. (2011). Nonconcave penalized likelihood with NP-dimensionality. IEEE Trans. Inform. Theory 57 5467–5484.
  • Fan, J., Xue, L. and Zou, H. (2014). Strong oracle optimality of folded concave penalized estimation. Ann. Statist. 42 819–849.
  • Friedman, J., Hastie, T. and Tibshirani, R. (2010). Regularization paths for generalized linear models via coordinate descent. J. Stat. Softw. 33 1–22.
  • Gu, Y. and Zou, H. (2015). Supplement to “High-dimensional generalizations of asymmetric least squares regression and their applications.” DOI:10.1214/15-AOS1431SUPP.
  • Huang, J. and Zhang, C. (2012). Estimation and selection via absolute penalized convex minimization and its multistage adaptive applications. J. Mach. Learn. Res. 13 1839–1864.
  • Koenker, R. and Bassett, G. Jr. (1978). Regression quantiles. Econometrica 46 33–50.
  • Koenker, R. and Bassett, G. Jr. (1982). Robust tests for heteroscedasticity based on regression quantiles. Econometrica 50 43–61.
  • Koenker, R. and Zhao, Q. S. (1994). $L$-estimation for linear heteroscedastic models. J. Nonparametr. Stat. 3 223–235.
  • Kuan, C., Yeh, J. and Hsu, Y. (2009). Assessing value at risk with CARE, the conditional autoregressive expectile models. J. Econometrics 150 261–270.
  • Meier, L., van de Geer, S. and Bühlmann, P. (2009). High-dimensional additive modeling. Ann. Statist. 37 3779–3821.
  • Negahban, S. N., Ravikumar, P., Wainwright, M. J. and Yu, B. (2012). A unified framework for high-dimensional analysis of $M$-estimators with decomposable regularizers. Statist. Sci. 27 538–557.
  • Newey, W. K. and Powell, J. L. (1987). Asymmetric least squares estimation and testing. Econometrica 55 819–847.
  • Parikh, N. and Boyd, S. (2013). Proximal algorithms. Found. Trends Optim. 1 123–231.
  • Rudelson, M. and Vershynin, R. (2013). Hanson–Wright inequality and sub-Gaussian concentration. Electron. Commun. Probab. 18 no. 82, 9.
  • Salvati, N., Tzavidis, N., Pratesi, M. and Chambers, R. (2012). Small area estimation via M-quantile geographically weighted regression. TEST 21 1–28.
  • Scheetz, T. E., Kim, K. A., Swiderski, R. E., Philp, A. R., Braun, T. A., Knudtson, K. L., Dorrance, A. M., DiBona, G. F., Huang, J., Casavant, T. L. et al. (2006). Regulation of gene expression in the mammalian eye and its relevance to eye disease. Proc. Natl. Acad. Sci. USA 103 14429–14434.
  • Taylor, J. W. (2008). Estimating value at risk and expected shortfall using expectiles. J. Financ. Econom. 6 231–252.
  • Tibshirani, R. (1996). Regression shrinkage and selection via the lasso. J. Roy. Statist. Soc. Ser. B 58 267–288.
  • Tseng, P. (2001). Convergence of a block coordinate descent method for nondifferentiable minimization. J. Optim. Theory Appl. 109 475–494.
  • Vershynin, R. (2010). Introduction to the non-asymptotic analysis of random matrices. Preprint. Available at arXiv:1011.3027v7.
  • Wang, L., Kim, Y. and Li, R. (2013). Calibrating nonconvex penalized regression in ultra-high dimension. Ann. Statist. 41 2505–2536.
  • Wang, L., Wu, Y. and Li, R. (2012). Quantile regression for analyzing heterogeneity in ultra-high dimension. J. Amer. Statist. Assoc. 107 214–222.
  • Xie, S., Zhou, Y. and Wan, A. T. K. (2014). A varying-coefficient expectile model for estimating value at risk. J. Bus. Econom. Statist. 32 576–592.
  • Yang, Y. and Zou, H. (2013). An efficient algorithm for computing the HHSVM and its generalizations. J. Comput. Graph. Statist. 22 396–415.
  • Ye, F. and Zhang, C. (2010). Rate minimaxity of the Lasso and Dantzig selector for the $\ell_{q}$ loss in $\ell_{r}$ balls. J. Mach. Learn. Res. 11 3519–3540.
  • Zhang, C. (2010). Nearly unbiased variable selection under minimax concave penalty. Ann. Statist. 38 894–942.
  • Zhao, P. and Yu, B. (2006). On model selection consistency of Lasso. J. Mach. Learn. Res. 7 2541–2563.
  • Zou, H. (2006). The adaptive lasso and its oracle properties. J. Amer. Statist. Assoc. 101 1418–1429.
  • Zou, H. and Li, R. (2008). One-step sparse estimates in nonconcave penalized likelihood models. Ann. Statist. 36 1509–1533.

Supplemental materials

  • Supplement to “High-dimensional generalizations of asymmetric least squares regression and their applications”. The supplementary material includes the iteration complexity analysis of the SALES algorithm.