The Annals of Statistics

Contour projected dimension reduction

Ronghua Luo, Hansheng Wang, and Chih-Ling Tsai

Full-text: Open access


In regression analysis, we employ contour projection (CP) to develop a new dimension reduction theory. Accordingly, we introduce the notions of the central contour subspace and the generalized contour subspace. We show that both of their structural dimensions are no larger than that of the central subspace [Cook (1998b), Regression Graphics, Wiley]. Furthermore, we employ CP-sliced inverse regression, CP-sliced average variance estimation and CP-directional regression to estimate the generalized contour subspace, and we establish their theoretical properties. Monte Carlo studies demonstrate that the three CP-based dimension reduction methods outperform their corresponding non-CP approaches when the predictors have heavy-tailed elliptical distributions. An empirical example is also presented to illustrate the usefulness of the CP method.
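The CP idea described above can be illustrated with a minimal sketch: standardize the predictors, project each standardized observation onto the unit sphere (the contour projection, which discards the heavy-tailed radial component of an elliptical distribution), and then apply ordinary sliced inverse regression to the projected predictors. The function name and all implementation details below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def cp_sir(X, y, n_slices=5, n_dirs=1):
    """Illustrative sketch of contour-projected sliced inverse regression."""
    n, p = X.shape
    # Standardize: Z = Sigma^{-1/2} (X - mean).
    Xc = X - X.mean(axis=0)
    Sigma = np.cov(X, rowvar=False)
    vals, vecs = np.linalg.eigh(Sigma)
    inv_sqrt = vecs @ np.diag(vals ** -0.5) @ vecs.T
    Z = Xc @ inv_sqrt
    # Contour projection: map each standardized point to the unit sphere,
    # removing the heavy-tailed radial part of an elliptical law.
    W = Z / np.linalg.norm(Z, axis=1, keepdims=True)
    # SIR step: slice the response, average W within each slice,
    # and form the candidate matrix from the slice means.
    order = np.argsort(y)
    M = np.zeros((p, p))
    for idx in np.array_split(order, n_slices):
        m = W[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(m, m)
    # Leading eigenvectors of M estimate the reduction directions,
    # back-transformed to the original predictor scale.
    evals, evecs = np.linalg.eigh(M)
    B = inv_sqrt @ evecs[:, ::-1][:, :n_dirs]
    return B / np.linalg.norm(B, axis=0)

# Toy example: heavy-tailed t predictors, response driven by the first
# coordinate, mirroring the setting where CP methods are expected to help.
rng = np.random.default_rng(0)
X = rng.standard_t(df=2, size=(500, 4))
y = X[:, 0] + 0.1 * rng.standard_normal(500)
B = cp_sir(X, y, n_slices=10)
```

In this toy setting, the leading estimated direction should load mainly on the first coordinate, even though the predictors have infinite variance.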

Article information

Ann. Statist. Volume 37, Number 6B (2009), 3743–3778.

First available in Project Euclid: 23 October 2009

Primary: 62G08: Nonparametric regression
Secondary: 62G35: Robustness; 62G20: Asymptotic properties

Keywords: central subspace; central contour subspace; contour projection; directional regression; generalized contour subspace; kernel contour subspace; $\sqrt{n}$-consistency; sliced average variance estimation; sliced inverse regression; sufficient contour subspace


Luo, Ronghua; Wang, Hansheng; Tsai, Chih-Ling. Contour projected dimension reduction. Ann. Statist. 37 (2009), no. 6B, 3743--3778. doi:10.1214/08-AOS679.



  • Chiaromonte, F., Cook, R. D. and Li, B. (2002). Partial dimension reduction with categorical predictors. Ann. Statist. 30 475–497.
  • Cook, R. D. (1994). On interpretation of regression plots. J. Amer. Statist. Assoc. 89 177–189.
  • Cook, R. D. (1996). Graphics for regression with a binary response. J. Amer. Statist. Assoc. 91 983–992.
  • Cook, R. D. (1998a). Principal Hessian directions revisited. J. Amer. Statist. Assoc. 93 84–94.
  • Cook, R. D. (1998b). Regression Graphics. Wiley, New York.
  • Cook, R. D. and Lee, H. (1999). Dimension reduction in binary response regression. J. Amer. Statist. Assoc. 94 1187–1200.
  • Cook, R. D. and Nachtsheim, C. J. (1994). Reweighting to achieve elliptically contoured covariates in regression. J. Amer. Statist. Assoc. 89 592–599.
  • Cook, R. D. and Ni, L. (2005). Sufficient dimension reduction via inverse regression: A minimum discrepancy approach. J. Amer. Statist. Assoc. 100 410–428.
  • Cook, R. D. and Setodji, M. (2003). A model free test for reduced rank in multivariate regression. J. Amer. Statist. Assoc. 98 340–351.
  • Cook, R. D. and Weisberg, S. (1991). Discussion of “Sliced inverse regression for dimension reduction.” J. Amer. Statist. Assoc. 86 28–33.
  • Eaton, M. L. (1986). A characterization of spherical distributions. J. Multivariate Anal. 20 272–276.
  • Eaton, M. L. and Tyler, D. (1994). The asymptotic distribution of singular values with applications to canonical correlations and correspondence analysis. J. Multivariate Anal. 50 238–264.
  • Jiang, G. and Wang, H. (2008). Should earnings thresholds be used as delisting criteria in the stock market? J. Accounting and Public Policy 27 409–419.
  • Lange, K. L., Little, R. J. A. and Taylor, J. M. G. (1989). Robust statistical modeling using the t distribution. J. Amer. Statist. Assoc. 84 881–896.
  • Li, K.-C. (1991). Sliced inverse regression for dimension reduction. J. Amer. Statist. Assoc. 86 316–327.
  • Li, K.-C. (1992). On principal Hessian directions for data visualization and dimension reduction: Another application of Stein’s lemma. J. Amer. Statist. Assoc. 87 1025–1039.
  • Li, K.-C., Aragon, Y., Shedden, K. and Agnan, C. T. (2003). Dimension reduction for multivariate response data. J. Amer. Statist. Assoc. 98 99–109.
  • Li, K.-C. and Duan, N. (1989). Regression analysis under link violation. Ann. Statist. 17 1009–1052.
  • Li, B. and Wang, S. (2007). On directional regression for dimension reduction. J. Amer. Statist. Assoc. 102 997–1008.
  • Li, B., Wen, S. and Zhu, L. (2008). On a projective re-sampling method for dimension reduction with multivariate responses. J. Amer. Statist. Assoc. 103 1177–1186.
  • Li, B., Zha, H. and Chiaromonte, F. (2005). Contour regression: A general approach to dimension reduction. Ann. Statist. 33 1580–1616.
  • Li, Y. X. and Zhu, L. (2007). Asymptotics for sliced average variance estimation. Ann. Statist. 35 41–69.
  • Muirhead, R. J. (1982). Aspects of Multivariate Statistical Theory. Wiley, New York.
  • Ni, L., Cook, R. D. and Tsai, C. L. (2005). A note on shrinkage sliced inverse regression. Biometrika 92 242–247.
  • Shao, Y., Cook, R. D. and Weisberg, S. (2007). Marginal test with sliced average variance estimation. Biometrika 94 285–296.
  • Tyler, D. E. (1987). A distribution-free M-estimator of multivariate scatter. Ann. Statist. 15 234–251.
  • Wang, H., Ni, L. and Tsai, C. L. (2008). Improving dimension reduction via contour projection. Statist. Sinica 18 299–311.
  • Wang, H. and Xia, Y. (2008). Sliced regression for dimension reduction. J. Amer. Statist. Assoc. 103 811–821.
  • Xia, Y. (2007). A constructive approach to the estimation of dimension reduction directions. Ann. Statist. 35 2654–2690.
  • Xia, Y., Tong, H., Li, W. K. and Zhu, L. (2002). An adaptive estimation of dimension reduction space. J. R. Stat. Soc. Ser. B Stat. Methodol. 64 363–410.
  • Ye, Z. and Weiss, R. E. (2003). Using the bootstrap to select one of a new class of dimension reduction methods. J. Amer. Statist. Assoc. 98 968–979.
  • Yin, X. and Cook, R. D. (2002). Dimension reduction for the conditional kth moment in regression. J. R. Stat. Soc. Ser. B Stat. Methodol. 64 159–175.
  • Yin, X. and Cook, R. D. (2003). Estimating central subspaces via inverse third moments. Biometrika 90 113–125.
  • Yin, X. and Cook, R. D. (2004). Dimension reduction via marginal fourth moments in regression. J. Comput. Graph. Statist. 13 554–570.
  • Zeng, P. and Zhu, Y. (2008). An integral transformation method for estimating the central mean and central subspaces. Unpublished manuscript.
  • Zhu, L. and Fang, K. T. (1996). Asymptotics for kernel estimates of sliced inverse regression. Ann. Statist. 24 1053–1068.
  • Zhu, Y. and Zeng, P. (2006). Fourier methods for estimating the central subspace and the central mean subspace in regression. J. Amer. Statist. Assoc. 101 1638–1651.