The Annals of Statistics

Testing predictor contributions in sufficient dimension reduction

R. Dennis Cook


Abstract

We develop tests of the hypothesis of no effect for selected predictors in regression, without assuming a model for the conditional distribution of the response given the predictors. Predictor effects need not be limited to the mean function and smoothing is not required. The general approach is based on sufficient dimension reduction, the idea being to replace the predictor vector with a lower-dimensional version without loss of information on the regression. Methodology using sliced inverse regression is developed in detail.
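One way to read the hypothesis described here, writing the predictor vector as X = (X1, X2) with X2 the selected predictors, is the conditional independence statement "Y is independent of X2 given X1"; this is a reading for orientation, not necessarily the paper's exact formulation. Because the abstract describes sufficient dimension reduction and sliced inverse regression only in words, the sketch below gives a minimal, generic Python implementation of basic sliced inverse regression (slice the response, eigen-decompose the between-slice covariance of standardized predictors). The function name sir_directions, the number of slices, and the toy model are illustrative assumptions; the code is not the predictor-contribution test developed in the paper.

```python
import numpy as np

def sir_directions(X, y, n_slices=10, d=1):
    """Minimal sliced inverse regression (SIR) sketch.

    Estimates a basis for a d-dimensional dimension-reduction subspace
    by eigen-decomposing the between-slice covariance of the
    standardized predictors. Illustrative only; not the testing
    methodology developed in the paper.
    """
    n, p = X.shape
    # Standardize the predictors: Z = (X - mean) Sigma^{-1/2}.
    Xc = X - X.mean(axis=0)
    Sigma = np.cov(Xc, rowvar=False)
    evals, evecs = np.linalg.eigh(Sigma)
    Sigma_inv_sqrt = evecs @ np.diag(evals ** -0.5) @ evecs.T
    Z = Xc @ Sigma_inv_sqrt

    # Slice the response into roughly equal-count slices.
    order = np.argsort(y)
    slices = np.array_split(order, n_slices)

    # Between-slice covariance of the slice means of Z.
    M = np.zeros((p, p))
    for idx in slices:
        m = Z[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(m, m)

    # Leading eigenvectors of M, mapped back to the original predictor
    # scale, span the estimated dimension-reduction subspace.
    w, v = np.linalg.eigh(M)
    top = v[:, np.argsort(w)[::-1][:d]]
    return Sigma_inv_sqrt @ top  # p x d basis matrix

# Toy usage (assumed example): y depends on X only through one
# linear combination of the first two predictors.
rng = np.random.default_rng(0)
X = rng.standard_normal((500, 6))
y = np.sin(X[:, 0] + 0.5 * X[:, 1]) + 0.1 * rng.standard_normal(500)
B = sir_directions(X, y, n_slices=10, d=1)
print(B.ravel())  # should load mainly on the first two coordinates
```

In this sketch, a predictor contributing nothing to the regression would correspond to a near-zero row of the estimated basis; the paper's contribution is to test such hypotheses formally, without smoothing or a model for the conditional distribution of the response.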

Article information

Source
Ann. Statist. Volume 32, Number 3 (2004), 1062-1092.

Dates
First available in Project Euclid: 24 May 2004

Permanent link to this document
http://projecteuclid.org/euclid.aos/1085408495

Digital Object Identifier
doi:10.1214/009053604000000292

Mathematical Reviews number (MathSciNet)
MR2065198

Zentralblatt MATH identifier
02100793

Subjects
Primary: 62G08: Nonparametric regression
Secondary: 62G09: Resampling methods; 62H05: Characterization and structure theory

Keywords
Central subspace; nonparametric regression; sliced inverse regression

Citation

Cook, R. Dennis. Testing predictor contributions in sufficient dimension reduction. Ann. Statist. 32 (2004), no. 3, 1062--1092. doi:10.1214/009053604000000292. http://projecteuclid.org/euclid.aos/1085408495.


