Abstract and Applied Analysis

Remodeling and Estimation for Sparse Partially Linear Regression Models

Yunhui Zeng, Xiuli Wang, and Lu Lin

Full-text: Open access

Abstract

When the dimension of the covariates in a regression model is high, one usually takes a submodel containing only the significant variables as the working model. This submodel may, however, be severely biased, and the resulting estimator of the parameter of interest can be poor when the coefficients of the removed variables are not exactly zero. In this paper, starting from the selected submodel, we introduce a two-stage remodeling method that yields a consistent estimator of the parameter of interest. More precisely, in the first stage we reconstruct an unbiased model via a multistep adjustment that exploits the correlation structure among the covariates; in the second stage we further reduce the adjusted model by a semiparametric variable selection method and simultaneously obtain a new estimator of the parameter of interest, whose convergence rate and asymptotic normality are also established. Simulation results illustrate that the new estimator outperforms those obtained from the submodel and the full model in terms of mean squared error of point estimation and mean squared prediction error of model prediction.
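The bias-correction idea behind the first stage can be sketched in a toy linear setting (this is an illustrative simulation, not the authors' multistep adjustment; all variable names and the one-step Frisch-Waugh-Lovell residualization are our own simplifying assumptions): when a removed covariate is correlated with a retained one, the submodel estimator is biased, but regressing the response on the retained covariate's residual, after projecting out the correlated covariate, removes that bias.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20000

# x2 plays the role of a "removed" covariate whose coefficient is
# small but not exactly zero, and which is correlated with the
# retained covariate x1.
x2 = rng.normal(size=n)
x1 = 0.8 * x2 + rng.normal(size=n)   # Corr(x1, x2) != 0
b1, b2 = 2.0, 0.5
y = b1 * x1 + b2 * x2 + rng.normal(size=n)

# Submodel estimator: OLS of y on x1 alone suffers omitted-variable
# bias of roughly b2 * Cov(x1, x2) / Var(x1).
b1_sub = (x1 @ y) / (x1 @ x1)

# Adjustment using the correlation between covariates: residualize
# x1 on x2, then regress y on the residual (Frisch-Waugh-Lovell).
gamma = (x2 @ x1) / (x2 @ x2)
e1 = x1 - gamma * x2
b1_adj = (e1 @ y) / (e1 @ e1)

print(b1_sub, b1_adj)  # b1_sub drifts away from 2.0; b1_adj is close to 2.0
```

The paper's actual procedure handles many removed covariates and a nonparametric component via a multistep adjustment, but the mechanism, using the covariance structure among covariates to rebuild an unbiased working model, is the same as in this two-variable sketch.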

Article information

Source
Abstr. Appl. Anal., Volume 2013, Special Issue (2012), Article ID 687151, 11 pages.

Dates
First available in Project Euclid: 26 February 2014

Permanent link to this document
https://projecteuclid.org/euclid.aaa/1393450539

Digital Object Identifier
doi:10.1155/2013/687151

Mathematical Reviews number (MathSciNet)
MR3034901

Zentralblatt MATH identifier
06161362

Citation

Zeng, Yunhui; Wang, Xiuli; Lin, Lu. Remodeling and Estimation for Sparse Partially Linear Regression Models. Abstr. Appl. Anal. 2013, Special Issue (2012), Article ID 687151, 11 pages. doi:10.1155/2013/687151. https://projecteuclid.org/euclid.aaa/1393450539

