The Annals of Statistics

Determining the dimension of iterative Hessian transformation

R. Dennis Cook and Bing Li



The central mean subspace (CMS) and iterative Hessian transformation (IHT) have been introduced recently for dimension reduction when the conditional mean is of interest. Suppose that X is a vector-valued predictor and Y is a scalar response. The basic problem is to find a lower-dimensional predictor ηTX such that E(Y|X)=E(Y|ηTX). The CMS defines the inferential object for this problem, and IHT provides an estimating procedure. Compared with other methods, IHT requires fewer assumptions and has been shown to perform well when the additional assumptions required by those methods fail. In this paper we give an asymptotic analysis of IHT and provide stepwise asymptotic hypothesis tests to determine the dimension of the CMS, as estimated by IHT. Here the original IHT method is modified to be invariant under location and scale transformations. We present a series of simulation studies that agree well with the asymptotic theory, and we apply the method to an ozone data set.
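The IHT idea described above can be sketched in code. The sketch below is an illustration, not the paper's exact procedure: following Cook and Li (2002), it forms the candidate matrix [β, Mβ, M²β, …], where β is the OLS slope of Y on the standardized predictors z and M = E[Y z zᵀ] is a Hessian-type moment matrix, then takes leading left singular vectors as estimated CMS directions. The function name `iht_directions` and all implementation details are assumptions for illustration; the paper's modified, location/scale-invariant version and its dimension tests involve further steps not shown here.

```python
# Hedged sketch of an IHT-style estimator of central-mean-subspace directions.
# All names and details here are illustrative, not the authors' implementation.
import numpy as np

def iht_directions(X, y, d):
    """Return d estimated CMS directions (columns), IHT-style sketch."""
    n, p = X.shape
    # Standardize predictors; the paper modifies IHT to be invariant
    # under location and scale transformations, which motivates this step.
    Xc = X - X.mean(axis=0)
    L = np.linalg.cholesky(np.cov(Xc, rowvar=False))   # Sigma = L L^T
    L_inv = np.linalg.inv(L)
    Z = Xc @ L_inv.T                                   # whitened predictors
    yc = y - y.mean()
    beta = Z.T @ yc / n                                # OLS slope in z-scale
    M = (Z * yc[:, None]).T @ Z / n                    # Hessian-type E[y z z^T]
    # Candidate matrix: powers of M applied to beta.
    K = np.column_stack([np.linalg.matrix_power(M, k) @ beta for k in range(p)])
    U, _, _ = np.linalg.svd(K)
    # Map the leading z-scale directions back to the original X scale.
    return L_inv.T @ U[:, :d]
```

As a usage sketch, on data with E(Y|X) depending on a single linear combination (e.g., Y = βᵀX + (βᵀX)² + error), the leading estimated direction should align closely with β for moderate sample sizes; the stepwise tests developed in the paper would then be used to decide the dimension d from the singular values of K.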

Article information

Ann. Statist. Volume 32, Number 6 (2004), 2501-2531.

First available in Project Euclid: 7 February 2005


Primary: 62G08: Nonparametric regression
Secondary: 62G09: Resampling methods; 62H05: Characterization and structure theory

Keywords: dimension reduction; conditional mean; asymptotic test; order determination; eigenvalues


Cook, R. Dennis; Li, Bing. Determining the dimension of iterative Hessian transformation. Ann. Statist. 32 (2004), no. 6, 2501--2531. doi:10.1214/009053604000000661.



  • Breiman, L. and Friedman, J. (1985). Estimating optimal transformations for multiple regression and correlation (with discussion). J. Amer. Statist. Assoc. 80 580--619.
  • Bura, E. and Cook, R. D. (2001). Estimating the structural dimension of regressions via parametric inverse regression. J. R. Stat. Soc. Ser. B Stat. Methodol. 63 393--410.
  • Clark, R. G., Henderson, H. V., Hoggard, G. K., Ellison, R. S. and Young, B. J. (1987). The ability of biochemical and haematological tests to predict recovery in periparturient recumbent cows. New Zealand Veterinary J. 35 126--133.
  • Cook, R. D. (1998a). Regression Graphics. Wiley, New York.
  • Cook, R. D. (1998b). Principal Hessian directions revisited (with discussion). J. Amer. Statist. Assoc. 93 84--100.
  • Cook, R. D. and Critchley, F. (2000). Identifying regression outliers and mixtures graphically. J. Amer. Statist. Assoc. 95 781--794.
  • Cook, R. D. and Li, B. (2002). Dimension reduction for the conditional mean in regression. Ann. Statist. 30 455--474.
  • Cook, R. D. and Nachtsheim, C. J. (1994). Reweighting to achieve elliptically contoured covariates in regression. J. Amer. Statist. Assoc. 89 592--599.
  • Cook, R. D. and Weisberg, S. (1991). Discussion of "Sliced inverse regression for dimension reduction," by K. C. Li. J. Amer. Statist. Assoc. 86 328--332.
  • Cook, R. D. and Weisberg, S. (1999). Applied Regression Including Computing and Graphics. Wiley, New York.
  • Diaconis, P. and Freedman, D. (1984). Asymptotics of graphical projection pursuit. Ann. Statist. 12 793--815.
  • Eaton, M. L. (1986). A characterization of spherical distributions. J. Multivariate Anal. 20 272--276.
  • Eaton, M. L. and Tyler, D. E. (1994). The asymptotic distribution of singular values with applications to canonical correlations and correspondence analysis. J. Multivariate Anal. 50 238--264.
  • Field, C. (1993). Tail areas of linear combinations of chi-squares and non-central chi-squares. J. Statist. Comput. Simulation 45 243--248.
  • Gather, U., Hilker, T. and Becker, C. (2001). A robustified version of sliced inverse regression. In Statistics in Genetics and in the Environmental Sciences (L. T. Fernholz, S. Morgenthaler and W. Stahel, eds.) 147--157. Birkhäuser, Basel.
  • Gather, U., Hilker, T. and Becker, C. (2002). A note on outlier sensitivity of sliced inverse regression. Statistics 36 271--281.
  • Hall, P. and Li, K. C. (1993). On almost linearity of low-dimensional projections from high-dimensional data. Ann. Statist. 21 867--889.
  • Li, B., Cook, R. D. and Chiaromonte, F. (2003). Dimension reduction for the conditional mean in regressions with categorical predictors. Ann. Statist. 31 1636--1668.
  • Li, K. C. (1991). Sliced inverse regression for dimension reduction (with discussion). J. Amer. Statist. Assoc. 86 316--342.
  • Li, K. C. (1992). On principal Hessian directions for data visualization and dimension reduction: Another application of Stein's lemma. J. Amer. Statist. Assoc. 87 1025--1039.
  • Li, K. C. and Duan, N. (1989). Regression analysis under link violation. Ann. Statist. 17 1009--1052.
  • Rao, C. R. (1965). Linear Statistical Inference and Its Applications. Wiley, New York.
  • Schott, J. (1994). Determining the dimensionality in sliced inverse regression. J. Amer. Statist. Assoc. 89 141--148.
  • Yin, X. and Cook, R. D. (2002). Dimension reduction for the conditional $k$th moment in regression. J. R. Stat. Soc. Ser. B Stat. Methodol. 64 159--175.