The Annals of Statistics

An adaptive composite quantile approach to dimension reduction

Efang Kong and Yingcun Xia


Abstract

Sufficient dimension reduction [J. Amer. Statist. Assoc. 86 (1991) 316–342] has long been a prominent issue in multivariate nonparametric regression analysis. To uncover the central dimension reduction space, we propose in this paper an adaptive composite quantile approach. Compared to existing methods, (1) it requires minimal assumptions and is capable of revealing all dimension reduction directions; (2) it is robust against outliers; and (3) it is structure-adaptive and thus more efficient. Asymptotic results are proved and numerical examples are provided, including a real data analysis.
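
To make the abstract concrete, here is a brief sketch of the two standard ingredients it combines, written in the notation of the cited literature (Li, 1991; Koenker and Bassett, 1978; Zou and Yuan, 2008) rather than as the authors' exact estimator. Sufficient dimension reduction posits that the response depends on the covariates only through a few linear combinations: for some $p \times d$ matrix $B_0$ with $d < p$,
\[
Y \perp\!\!\!\perp X \mid B_0^{\top} X ,
\]
and the central dimension reduction space is the smallest column space $\operatorname{span}(B_0)$ for which this holds. A composite quantile criterion aggregates the check loss $\rho_\tau(u) = u\{\tau - I(u < 0)\}$ over several quantile levels $0 < \tau_1 < \cdots < \tau_K < 1$; schematically,
\[
\min_{B,\ g_1,\ldots,g_K} \ \sum_{k=1}^{K} \sum_{i=1}^{n} \rho_{\tau_k}\bigl( Y_i - g_k(B^{\top} X_i) \bigr),
\]
where $g_k$ denotes the $\tau_k$-th conditional quantile function, estimated nonparametrically (e.g., by local polynomial smoothing). Because the family of conditional quantiles characterizes the whole conditional distribution of $Y$ given $B^{\top} X$, a criterion of this form can in principle recover all directions in the central space rather than only those affecting the conditional mean, and the check loss's insensitivity to extreme residuals is what delivers the robustness to outliers claimed above.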

Article information

Source
Ann. Statist., Volume 42, Number 4 (2014), 1657–1688.

Dates
First available in Project Euclid: 7 August 2014

Permanent link to this document
https://projecteuclid.org/euclid.aos/1407420012

Digital Object Identifier
doi:10.1214/14-AOS1242

Mathematical Reviews number (MathSciNet)
MR3262464

Zentralblatt MATH identifier
1310.62052

Subjects
Primary: 62J07: Ridge regression; shrinkage estimators

Keywords
Bahadur approximation; sufficient dimension reduction; local polynomial smoothing; quantile regression; semiparametric models; U-processes

Citation

Kong, Efang; Xia, Yingcun. An adaptive composite quantile approach to dimension reduction. Ann. Statist. 42 (2014), no. 4, 1657–1688. doi:10.1214/14-AOS1242. https://projecteuclid.org/euclid.aos/1407420012



References

  • Arcones, M. A. (1995). A Bernstein-type inequality for $U$-statistics and $U$-processes. Statist. Probab. Lett. 22 239–247.
  • Bai, Z. D., Miao, B. Q. and Rao, C. R. (1991). Estimation of directions of arrival of signals: Asymptotic results. In Advances in Spectrum Analysis and Array Processing (S. Haykin, ed.) II 327–347. Prentice Hall, Upper Saddle River, NJ.
  • Bhattacharya, P. K. and Gangopadhyay, A. K. (1990). Kernel and nearest-neighbor estimation of a conditional quantile. Ann. Statist. 18 1400–1415.
  • Chaudhuri, P. (1991). Global nonparametric estimation of conditional quantile functions and their derivatives. J. Multivariate Anal. 39 246–269.
  • Chaudhuri, P., Doksum, K. and Samarov, A. (1997). On average derivative quantile regression. Ann. Statist. 25 715–744.
  • Cook, R. D. (1994). Using dimension-reduction subspaces to identify important inputs in models of physical systems. In Proceedings of the Section on Physical and Engineering Sciences 18–25. Amer. Statist. Assoc., Alexandria, VA.
  • Cook, R. D. (1998). Regression Graphics. Wiley, New York.
  • Cook, R. D. (2007). Fisher lecture: Dimension reduction in regression. Statist. Sci. 22 1–26.
  • Cook, R. D. and Li, B. (2002). Dimension reduction for conditional mean in regression. Ann. Statist. 30 455–474.
  • Fukumizu, K., Bach, F. R. and Jordan, M. I. (2009). Kernel dimension reduction in regression. Ann. Statist. 37 1871–1905.
  • He, X., Wang, L. and Hong, H. G. (2013). Quantile-adaptive model-free variable screening for high-dimensional heterogeneous data. Ann. Statist. 41 342–369.
  • Hristache, M., Juditsky, A., Polzehl, J. and Spokoiny, V. (2001). Structure adaptive approach for dimension reduction. Ann. Statist. 29 1537–1566.
  • Kai, B., Li, R. and Zou, H. (2010). Local composite quantile regression smoothing: An efficient and safe alternative to local polynomial regression. J. R. Stat. Soc. Ser. B Stat. Methodol. 72 49–69.
  • Kato, T. (1995). Perturbation Theory for Linear Operators. Springer, Berlin.
  • Koenker, R. and Bassett, G. Jr. (1978). Regression quantiles. Econometrica 46 33–50.
  • Koenker, R. and Machado, J. A. F. (1999). Goodness of fit and related inference processes for quantile regression. J. Amer. Statist. Assoc. 94 1296–1310.
  • Koenker, R., Ng, P. and Portnoy, S. (1994). Quantile smoothing splines. Biometrika 81 673–680.
  • Koenker, R., Portnoy, S. and Ng, P. (1992). Nonparametric estimation of conditional quantile functions. In $L_1$-statistical Analysis and Related Methods (Neuchâtel, 1992) (Y. Dodge, ed.) 217–229. North-Holland, Amsterdam.
  • Kong, E., Linton, O. and Xia, Y. (2010). Uniform Bahadur representation for local polynomial estimates of $M$-regression and its application to the additive model. Econometric Theory 26 1529–1564.
  • Kong, E., Linton, O. and Xia, Y. (2013). Global Bahadur representation for nonparametric censored regression quantiles and its applications. Econometric Theory 29 941–968.
  • Li, K.-C. (1991). Sliced inverse regression for dimension reduction. J. Amer. Statist. Assoc. 86 316–342.
  • Li, B., Cook, R. D. and Chiaromonte, F. (2003). Dimension reduction for the conditional mean in regressions with categorical predictors. Ann. Statist. 31 1636–1668.
  • Li, B., Zha, H. and Chiaromonte, F. (2005). Contour regression: A general approach to dimension reduction. Ann. Statist. 33 1580–1616.
  • Lue, H.-H. (2004). Principal Hessian directions for regression with measurement error. Biometrika 91 409–423.
  • Ma, Y. and Zhu, L. (2012). A semiparametric approach to dimension reduction. J. Amer. Statist. Assoc. 107 168–179.
  • Masry, E. (1996). Multivariate local polynomial regression for time series: Uniform strong consistency and rates. J. Time Series Anal. 17 571–599.
  • Nolan, D. and Pollard, D. (1987). $U$-processes: Rates of convergence. Ann. Statist. 15 780–799.
  • Pakes, A. and Pollard, D. (1989). Simulation and the asymptotics of optimization estimators. Econometrica 57 1027–1057.
  • Pollard, D. (1984). Convergence of Stochastic Processes. Springer, New York.
  • Sun, S. G. (1988). Analytic expressions for the derivatives of the eigenvalues and eigenvectors of a matrix. Adv. in Math. (Beijing) 17 391–397.
  • Truong, Y. K. (1989). Asymptotic properties of kernel estimators based on local medians. Ann. Statist. 17 606–617.
  • van der Vaart, A. W. and Wellner, J. A. (1996). Weak Convergence and Empirical Processes: With Applications to Statistics. Springer, New York.
  • Wang, H. and Xia, Y. (2008). Sliced regression for dimension reduction. J. Amer. Statist. Assoc. 103 811–821.
  • Xia, Y. (2007). A constructive approach to the estimation of dimension reduction directions. Ann. Statist. 35 2654–2690.
  • Xia, Y., Tong, H., Li, W. K. and Zhu, L.-X. (2002). An adaptive estimation of dimension reduction space. J. R. Stat. Soc. Ser. B Stat. Methodol. 64 363–410.
  • Yin, X. and Cook, R. D. (2002). Dimension reduction for the conditional $k$th moment in regression. J. R. Stat. Soc. Ser. B Stat. Methodol. 64 159–175.
  • Yin, X. and Li, B. (2011). Sufficient dimension reduction based on an ensemble of minimum average variance estimators. Ann. Statist. 39 3392–3416.
  • Yin, X., Li, B. and Cook, R. D. (2008). Successive direction extraction for estimating the central subspace in a multiple-index regression. J. Multivariate Anal. 99 1733–1757.
  • Yu, K. and Jones, M. C. (1998). Local linear quantile regression. J. Amer. Statist. Assoc. 93 228–237.
  • Zhu, L.-X. and Fang, K.-T. (1996). Asymptotics for kernel estimate of sliced inverse regression. Ann. Statist. 24 1053–1068.
  • Zhu, Y. and Zeng, P. (2006). Fourier methods for estimating the central subspace and the central mean subspace in regression. J. Amer. Statist. Assoc. 101 1638–1651.
  • Zhu, L.-P. and Zhu, L.-X. (2009). Dimension reduction for conditional variance in regressions. Statist. Sinica 19 869–883.
  • Zhu, L.-P., Zhu, L.-X. and Feng, Z.-H. (2010). Dimension reduction in regressions through cumulative slicing estimation. J. Amer. Statist. Assoc. 105 1455–1466.
  • Zou, H. and Yuan, M. (2008). Composite quantile regression and the oracle model selection theory. Ann. Statist. 36 1108–1126.