Abstract
Existing works on variable selection and estimation for high-dimensional nonparametric regression focus primarily on modelling a conditional mean function, on which a restrictive additive structure is commonly imposed. We consider a more general framework which covers different types of regression derived from a broad class of convex loss functions, without assuming additivity of the nonparametric regression function to be estimated. A novel penalised local linear regression procedure is proposed for simultaneous variable selection and estimation under this framework. It performs Bridge-penalised local linear regression and regularised bandwidth estimation in an alternating optimisation scheme. The covariate dimension may exceed any polynomial order of the sample size, while the number of active variables is allowed to grow slowly with sample size. The procedure is shown to be consistent in variable selection and to yield a regression function estimator endowed with an oracle property. Simulation and real data examples are presented to illustrate the performance of the proposed method in mean regression, quantile regression and logistic regression problems.
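The core idea described above can be illustrated with a minimal sketch of one step of the procedure: a Bridge-penalised (L_q, 0 < q < 1) local linear least-squares fit at a single point, with a Gaussian product kernel. All data, the bandwidth `h`, the penalty level `lam`, and the exponent `q` below are hypothetical choices for illustration; the paper's full procedure also alternates this step with regularised bandwidth estimation and covers general convex losses, not only squared error.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Hypothetical toy data: only the first of three covariates is active.
n, d = 300, 3
X = rng.uniform(-1.0, 1.0, size=(n, d))
y = np.sin(2.0 * X[:, 0]) + 0.1 * rng.standard_normal(n)

def penalised_local_linear(x0, X, y, h=0.4, lam=0.05, q=0.5):
    """Bridge-penalised local linear least squares at the point x0.

    Minimises, over the intercept a and slope vector b,
        sum_i K_h(X_i - x0) * (y_i - a - b'(X_i - x0))^2
            + lam * sum_j |b_j|^q,
    so that slopes of inactive covariates are shrunk towards zero.
    """
    Z = X - x0
    # Gaussian product kernel weights (one common choice of K_h).
    w = np.exp(-0.5 * np.sum((Z / h) ** 2, axis=1))

    def obj(theta):
        a, b = theta[0], theta[1:]
        resid = y - a - Z @ b
        return np.sum(w * resid ** 2) + lam * np.sum(np.abs(b) ** q)

    # The bridge penalty is nonsmooth at zero, so a derivative-free
    # optimiser is used for this small-dimensional sketch.
    res = minimize(obj, np.zeros(d + 1), method="Nelder-Mead",
                   options={"xatol": 1e-8, "fatol": 1e-8, "maxiter": 20000})
    return res.x[0], res.x[1:]

a_hat, b_hat = penalised_local_linear(np.zeros(d), X, y)
print("intercept:", a_hat)   # estimate of sin(0) = 0
print("slopes:", b_hat)      # first slope large, others shrunk near 0
```

In the full method, the fit above would be recomputed at each evaluation point, and the bandwidth itself would be re-estimated with its own regularisation inside the alternating scheme.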
Funding Statement
Supported by grants from the Research Grants Council of the Hong Kong Special Administrative Region, China (Project No. 17307819, 17307321).
Citation
Kin Yap Cheung, Stephen M.S. Lee. "High-dimensional local linear regression under sparsity and convex losses." Electron. J. Statist. 18 (1) 803 - 847, 2024. https://doi.org/10.1214/24-EJS2216