Electronic Journal of Statistics

Local linear smoothing for sparse high dimensional varying coefficient models

Eun Ryung Lee and Enno Mammen

Full-text: Open access

Abstract

Varying coefficient models are useful generalizations of parametric linear models. They allow for parameters that depend on a covariate or that develop over time, and they have a wide range of applications in time series analysis and regression. In time series analysis they have proved to be a powerful approach for inference on behavioral and structural changes over time. In this paper, we are concerned with high-dimensional varying coefficient models, including the time varying coefficient model. Most studies of high-dimensional nonparametric models treat penalization of series estimators. On the other hand, kernel smoothing is a well established, well understood and successful approach in nonparametric estimation, in particular for the time varying coefficient model, but little has been done on kernel smoothing in high-dimensional models. In this paper we close this gap and develop a penalized kernel smoothing approach for sparse high-dimensional models. The proposed estimators make use of a novel penalization scheme that works with kernel smoothing. We establish a general and systematic theoretical analysis in high dimensions. This complements recent alternative approaches that are based on basis approximations and that allow more direct arguments for carrying over insights from high-dimensional linear models. Furthermore, we develop theory not only for regression with independent observations but also for locally stationary time series in high-dimensional sparse varying coefficient models. The development of theory for locally stationary processes in a high-dimensional setting creates technical challenges. We also address issues of numerical implementation and of data adaptive selection of tuning parameters for penalization. The finite sample performance of the proposed methods is studied by simulations and is illustrated by an empirical analysis of NASDAQ composite index data.
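To fix ideas, the varying coefficient model described above can be sketched numerically. The following is a minimal illustration, not the paper's actual estimator: it fits the coefficient functions a_j(t0) in Y = sum_j a_j(T) X_j + noise by local linear weighted least squares, and applies a simple soft-threshold step as a crude stand-in for the paper's penalization scheme (the function name, the ridge jitter, and the thresholding rule are assumptions made for this sketch).

```python
import numpy as np

def local_linear_vc(t0, T, X, Y, h, lam=0.0):
    """Local linear estimate of the coefficient functions a_j(t0) in the
    varying coefficient model Y = sum_j a_j(T) * X_j + noise.

    Weighted least squares with Epanechnikov kernel weights; `lam` applies
    a crude soft-threshold to the fitted coefficients as an illustrative
    stand-in for the paper's penalization scheme."""
    u = (T - t0) / h
    w = 0.75 * np.maximum(1.0 - u**2, 0.0)          # Epanechnikov kernel weights
    # local linear design: columns [X_j, X_j * (T - t0)]
    Z = np.hstack([X, X * (T - t0)[:, None]])
    A = Z.T @ (w[:, None] * Z)
    b = Z.T @ (w * Y)
    # small ridge term keeps the weighted normal equations well conditioned
    beta = np.linalg.solve(A + 1e-8 * np.eye(A.shape[0]), b)
    p = X.shape[1]
    a_hat = beta[:p]                                # level parts a_j(t0)
    # soft-threshold toward sparsity (not the exact penalty of the paper)
    return np.sign(a_hat) * np.maximum(np.abs(a_hat) - lam, 0.0)

# toy check: a_1(t) = sin(2*pi*t) is active, a_2(t) = 0 is inactive
rng = np.random.default_rng(0)
n = 500
T = rng.uniform(0.0, 1.0, n)
X = rng.normal(size=(n, 2))
Y = np.sin(2 * np.pi * T) * X[:, 0] + 0.1 * rng.normal(size=n)
a = local_linear_vc(0.25, T, X, Y, h=0.1, lam=0.05)
print(a)  # a[0] should be close to sin(pi/2) = 1, a[1] close to 0
```

The thresholding here only mimics the effect of a sparsity penalty at a single point t0; the paper's scheme penalizes whole coefficient functions jointly across t, which is what yields consistent structural identification.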

Article information

Source
Electron. J. Statist. Volume 10, Number 1 (2016), 855-894.

Dates
Received: July 2015
First available in Project Euclid: 6 April 2016

Permanent link to this document
https://projecteuclid.org/euclid.ejs/1459967425

Digital Object Identifier
doi:10.1214/16-EJS1110

Mathematical Reviews number (MathSciNet)
MR3486419

Zentralblatt MATH identifier
1349.62313

Subjects
Primary: 60K35: Interacting random processes; statistical mechanics type models; percolation theory [See also 82B43, 82C43]
Secondary: 60K35: Interacting random processes; statistical mechanics type models; percolation theory [See also 82B43, 82C43]

Keywords
Sparse estimation; locally stationary time series; high-dimensional data; local stationarity; time varying coefficient models; kernel method; local linear method; penalized methods; BIC; oracle inequality; oracle property; partially linear varying coefficient model; semiparametric model; consistent structural identification; second order cone programming

Citation

Lee, Eun Ryung; Mammen, Enno. Local linear smoothing for sparse high dimensional varying coefficient models. Electron. J. Statist. 10 (2016), no. 1, 855--894. doi:10.1214/16-EJS1110. https://projecteuclid.org/euclid.ejs/1459967425.
