Open Access
High-dimensional local linear regression under sparsity and convex losses
Kin Yap Cheung, Stephen M.S. Lee
Electron. J. Statist. 18(1): 803-847 (2024). DOI: 10.1214/24-EJS2216


Existing work on variable selection and estimation in high-dimensional nonparametric regression focuses primarily on modelling a conditional mean function, on which a restrictive additive structure is commonly imposed. We consider a more general framework that covers different types of regression derived from a broad class of convex loss functions, without assuming additivity of the nonparametric regression function to be estimated. A novel penalised local linear regression procedure is proposed for simultaneous variable selection and estimation under this framework. It performs bridge-penalised local linear regression and regularised bandwidth estimation in an alternating optimisation scheme. The covariate dimension may exceed any polynomial order of the sample size, while the number of active variables is allowed to grow slowly with the sample size. The procedure is shown to be consistent in variable selection and to yield a regression function estimator endowed with an oracle property. Simulation and real data examples illustrate the performance of the proposed method in mean regression, quantile regression and logistic regression problems.
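To give a concrete feel for the kind of estimator the abstract describes, the sketch below fits a bridge-penalised local linear regression at a single evaluation point. It is an illustration only, not the authors' algorithm: it assumes a squared-error loss, a Gaussian product kernel, a single fixed bandwidth `h`, a smoothed version of the nonconvex bridge penalty, and plain gradient descent, whereas the paper treats general convex losses and alternates the regression step with regularised bandwidth estimation. All function and parameter names here are hypothetical.

```python
import numpy as np

def bridge_local_linear(X, y, x0, h, lam, gamma=0.5, eps=1e-2,
                        lr=0.01, n_iter=3000):
    """Illustrative bridge-penalised local linear fit at a point x0.

    Minimises, by plain gradient descent,
        sum_i K_h(X_i - x0) (y_i - a - b'(X_i - x0))^2
            + lam * sum_j (b_j^2 + eps)^(gamma/2),
    where the second term is a smoothed stand-in for the bridge
    penalty lam * sum_j |b_j|^gamma with exponent gamma < 1.
    """
    n, p = X.shape
    Z = X - x0                                        # centred covariates
    w = np.exp(-0.5 * np.sum((Z / h) ** 2, axis=1))   # Gaussian product kernel weights
    a, b = 0.0, np.zeros(p)
    for _ in range(n_iter):
        resid = y - a - Z @ b
        grad_a = -2.0 * np.sum(w * resid)
        grad_b = (-2.0 * Z.T @ (w * resid)
                  + lam * gamma * b * (b ** 2 + eps) ** (gamma / 2 - 1))
        a -= lr * grad_a
        b -= lr * grad_b
    return a, b  # a estimates m(x0); b estimates its gradient, shrunk towards 0

# Toy example: only the first of five covariates is active.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 5))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(200)
a_hat, b_hat = bridge_local_linear(X, y, x0=np.zeros(5), h=0.5, lam=0.5)
```

The penalty on the local slopes `b` is what drives variable selection: covariates that are locally irrelevant have their fitted slopes shrunk towards zero, mimicking in miniature the sparsity mechanism described in the abstract.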

Funding Statement

Supported by grants from the Research Grants Council of the Hong Kong Special Administrative Region, China (Project No. 17307819, 17307321).



Received: 1 September 2023; Published: 2024
First available in Project Euclid: 26 February 2024


Primary: 62G08

Keywords: convex loss, high-dimensional, local linear regression, nonparametric regression, variable selection
