Abstract
Nonparametric regression using locally weighted least squares was first discussed by Stone and by Cleveland. Recently, it was shown by Fan and by Fan and Gijbels that the local linear kernel-weighted least squares regression estimator has asymptotic properties making it superior, in certain senses, to the Nadaraya-Watson and Gasser-Müller kernel estimators. In this paper we extend their results on asymptotic bias and variance to the case of multivariate predictor variables. We are able to derive the leading bias and variance terms for general multivariate kernel weights using weighted least squares matrix theory. This approach is especially convenient when analysing the asymptotic conditional bias and variance of the estimator at points near the boundary of the support of the predictors. We also investigate the asymptotic properties of the multivariate local quadratic least squares regression estimator discussed by Cleveland and Devlin and, in the univariate case, higher-order polynomial fits and derivative estimation.
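As a point of reference, the following is a sketch of the standard formulation of the multivariate local linear estimator studied in the paper; the notation (bandwidth matrix \(\mathbf{H}\), scaled kernel \(K_{\mathbf{H}}\), first unit vector \(\mathbf{e}_1\)) is assumed here for illustration and is not fixed by the abstract itself. At a point \(\mathbf{x}\), the estimator is defined through the kernel-weighted least squares fit
\[
(\hat{\beta}_0, \hat{\boldsymbol{\beta}}_1) = \operatorname*{arg\,min}_{\beta_0,\, \boldsymbol{\beta}_1} \sum_{i=1}^{n} \bigl\{ Y_i - \beta_0 - \boldsymbol{\beta}_1^{T}(\mathbf{X}_i - \mathbf{x}) \bigr\}^{2} K_{\mathbf{H}}(\mathbf{X}_i - \mathbf{x}),
\]
with the regression estimate taken to be the fitted intercept, \(\hat{m}(\mathbf{x}; \mathbf{H}) = \hat{\beta}_0\). In weighted least squares matrix form this is
\[
\hat{m}(\mathbf{x}; \mathbf{H}) = \mathbf{e}_1^{T} \bigl( \mathbf{X}_{\mathbf{x}}^{T} \mathbf{W}_{\mathbf{x}} \mathbf{X}_{\mathbf{x}} \bigr)^{-1} \mathbf{X}_{\mathbf{x}}^{T} \mathbf{W}_{\mathbf{x}} \mathbf{Y},
\]
where \(\mathbf{X}_{\mathbf{x}}\) has rows \(\bigl(1, (\mathbf{X}_i - \mathbf{x})^{T}\bigr)\) and \(\mathbf{W}_{\mathbf{x}} = \operatorname{diag}\bigl\{ K_{\mathbf{H}}(\mathbf{X}_1 - \mathbf{x}), \ldots, K_{\mathbf{H}}(\mathbf{X}_n - \mathbf{x}) \bigr\}\). It is this matrix representation that underlies the bias and variance calculations described above.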
Citation
D. Ruppert and M. P. Wand. "Multivariate Locally Weighted Least Squares Regression." Ann. Statist. 22 (3), 1346-1370, September 1994. https://doi.org/10.1214/aos/1176325632