The Annals of Statistics

On surrogate dimension reduction for measurement error regression: An invariance law

Bing Li and Xiangrong Yin

Full-text: Open access


We consider a general nonlinear regression problem where the predictors contain measurement error. It has been recently discovered that several well-known dimension reduction methods, such as OLS, SIR and pHd, can be performed on the surrogate regression problem to produce consistent estimates for the original regression problem involving the unobserved true predictor. In this paper we establish a general invariance law between the surrogate and the original dimension reduction spaces, which implies that, at least at the population level, the two dimension reduction problems are in fact equivalent. Consequently we can apply all existing dimension reduction methods to measurement error regression problems. The equivalence holds exactly for multivariate normal predictors, and approximately for arbitrary predictors. We also characterize the rate of convergence for the surrogate dimension reduction estimators. Finally, we apply several dimension reduction methods to real and simulated data sets involving measurement error to compare their performances.
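The equivalence described in the abstract can be illustrated numerically. The sketch below (an illustration assuming the normal-predictor case, not code from the paper) implements a basic version of sliced inverse regression (SIR) and applies it both to true predictors X and to surrogates W = X + measurement error; under multivariate normal X, both recover the same dimension reduction direction.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 2000, 5
beta = np.array([1.0, 1.0, 0.0, 0.0, 0.0]) / np.sqrt(2.0)

# True predictors are multivariate normal -- the case in which the
# invariance law holds exactly.
X = rng.standard_normal((n, p))
y = (X @ beta) ** 3 + 0.5 * rng.standard_normal(n)

# Surrogate predictors: the observed W equal X plus measurement error.
W = X + 0.3 * rng.standard_normal((n, p))

def sir_direction(Z, resp, n_slices=10):
    """First SIR direction: leading eigenvector of the covariance of
    slice means of the standardized predictors."""
    Zc = Z - Z.mean(axis=0)
    # Standardize via the inverse square root of the sample covariance.
    cov = np.cov(Zc, rowvar=False)
    vals, vecs = np.linalg.eigh(cov)
    inv_sqrt = vecs @ np.diag(vals ** -0.5) @ vecs.T
    Zs = Zc @ inv_sqrt
    # Slice the response and average the standardized predictors per slice.
    order = np.argsort(resp)
    means = [Zs[idx].mean(axis=0) for idx in np.array_split(order, n_slices)]
    M = sum(np.outer(m, m) for m in means) / n_slices
    # Leading eigenvector, mapped back to the original predictor scale.
    w = np.linalg.eigh(M)[1][:, -1]
    d = inv_sqrt @ w
    return d / np.linalg.norm(d)

d_true = sir_direction(X, y)   # SIR on the unobserved true predictors
d_surr = sir_direction(W, y)   # surrogate SIR on the mismeasured predictors

# Both estimated directions align with beta (up to sign).
print(abs(d_true @ beta), abs(d_surr @ beta))
```

The slice count and error scale here are arbitrary choices; the point is only that the surrogate estimate targets the same direction as the original, up to sampling noise and attenuation.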

Article information

Ann. Statist., Volume 35, Number 5 (2007), 2143-2172.

First available in Project Euclid: 7 November 2007


Subject classification. Primary: 62G08 (nonparametric regression); 62H12 (estimation).

Keywords: Central spaces; central mean space; invariance; regression graphics; surrogate predictors and response; weak convergence in probability.


Li, Bing; Yin, Xiangrong. On surrogate dimension reduction for measurement error regression: An invariance law. Ann. Statist. 35 (2007), no. 5, 2143--2172. doi:10.1214/009053607000000172.


