Brazilian Journal of Probability and Statistics

Dimension reduction based on conditional multiple index density function

Jun Zhang, Baohua He, Tao Lu, and Songqiao Wen


Abstract

In this paper, a dimension reduction method is proposed that uses the first derivative of the conditional density function of the response given the predictors. To estimate the central subspace, we propose a direct methodology that takes the expectation of the product of the predictor vector and a kernel function of the response, which captures the directions appearing in the conditional density function. The consistency and asymptotic normality of the proposed estimator are established. Furthermore, simulations are conducted to evaluate the performance of the proposed method and to compare it with existing methods, and a real data set is analyzed for illustration.
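To make the kernel-moment idea in the abstract concrete, the following is a minimal sketch in Python/NumPy, under stated assumptions: it averages the product of the standardized predictor with a Gaussian kernel evaluated at the response, aggregates the resulting vectors into a candidate matrix, and extracts central-subspace directions by eigendecomposition. The Gaussian kernel, the rule-of-thumb bandwidth, the outer-product aggregation, the function name kernel_moment_directions, and the toy model are all illustrative assumptions, not the authors' actual estimator.

import numpy as np

def kernel_moment_directions(X, Y, d, h=None):
    """Sketch: directions from m(y) = E[X K_h(Y - y)] (assumed form)."""
    n, p = X.shape
    # Standardize predictors so candidate directions live on the Z-scale.
    mu = X.mean(axis=0)
    Sigma = np.cov(X, rowvar=False)
    L_inv = np.linalg.inv(np.linalg.cholesky(Sigma))   # a square root of Sigma^{-1}
    Z = (X - mu) @ L_inv.T
    if h is None:
        h = 1.06 * Y.std() * n ** (-0.2)               # rule-of-thumb bandwidth (assumption)
    # Gaussian kernel weights K_h(Y_i - Y_j) at each observed response Y_j.
    W = np.exp(-0.5 * ((Y[:, None] - Y[None, :]) / h) ** 2) / (h * np.sqrt(2 * np.pi))
    M_y = W.T @ Z / n            # row j approximates m(Y_j) = E[Z K_h(Y - Y_j)]
    M = M_y.T @ M_y / n          # candidate matrix: average of m(Y_j) m(Y_j)^T
    eigvecs = np.linalg.eigh(M)[1]
    B_z = eigvecs[:, ::-1][:, :d]                      # top-d eigenvectors
    return L_inv.T @ B_z                               # map back to the X-scale

# Toy monotone single-index model: Y = (X beta)^3 + noise, true direction beta.
rng = np.random.default_rng(0)
n, p = 500, 6
X = rng.standard_normal((n, p))
beta = np.zeros(p); beta[0] = 1.0
Y = (X @ beta) ** 3 + 0.5 * rng.standard_normal(n)
B_hat = kernel_moment_directions(X, Y, d=1)
print(abs(np.corrcoef(X @ B_hat[:, 0], X @ beta)[0, 1]))   # close to 1

Note that this first-moment version behaves like an inverse-regression estimator and can miss directions under symmetric links; the paper works instead with the first derivative of the conditional density, which this sketch does not implement.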

Article information

Source
Braz. J. Probab. Stat., Volume 32, Number 4 (2018), 851–872.

Dates
Received: March 2017
Accepted: July 2017
First available in Project Euclid: 17 August 2018

Permanent link to this document
https://projecteuclid.org/euclid.bjps/1534492905

Digital Object Identifier
doi:10.1214/17-BJPS370

Mathematical Reviews number (MathSciNet)
MR3845033

Zentralblatt MATH identifier
06979604

Keywords
Central subspace; conditional density function; dimension reduction; kernel function

Citation

Zhang, Jun; He, Baohua; Lu, Tao; Wen, Songqiao. Dimension reduction based on conditional multiple index density function. Braz. J. Probab. Stat. 32 (2018), no. 4, 851–872. doi:10.1214/17-BJPS370. https://projecteuclid.org/euclid.bjps/1534492905


