The Annals of Statistics

Nonparametric Regression with Errors in Variables

Jianqing Fan and Young K. Truong


Abstract

The effect of errors in variables in nonparametric regression estimation is examined. To account for errors in the covariates, deconvolution is used to construct a new class of kernel estimators. It is shown that the optimal local and global rates of convergence of these kernel estimators are characterized by the tail behavior of the characteristic function of the error distribution. In fact, there are two types of rates of convergence, according to whether the error distribution is ordinary smooth or super smooth. It is also shown that these results hold uniformly over a class of joint distributions of the response and the covariate that is rich enough for many practical applications. Furthermore, to establish optimality, we show that the convergence rates of all possible estimators are bounded below by a rate that the kernel estimators attain.
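
The estimator analyzed in the paper replaces the ordinary kernel by a deconvoluting kernel whose Fourier transform divides out the characteristic function of the measurement error, and then forms a Nadaraya-Watson-type ratio from the noisy covariates. The following is a minimal numerical sketch of such an estimator, not the authors' code: it assumes, purely for illustration, a Laplace measurement error (an ordinary smooth case) with known scale sigma, a kernel with Fourier transform phi_K(t) = (1 - t^2)^3 supported on [-1, 1], and a fixed bandwidth h; the function names are likewise illustrative.

import numpy as np

def deconv_kernel(u, h, sigma, grid=2001):
    # K_n(u) = (1/(2*pi)) * integral of exp(-i*t*u) * phi_K(t) / phi_eps(t/h) dt;
    # the integrand is real and even, so the cosine part of the integral suffices.
    t = np.linspace(-1.0, 1.0, grid)               # phi_K vanishes outside [-1, 1]
    phi_K = (1.0 - t**2) ** 3                      # Fourier transform of the smoothing kernel (assumed)
    phi_eps = 1.0 / (1.0 + (sigma * t / h) ** 2)   # Laplace error: phi_eps(s) = 1/(1 + sigma^2 s^2)
    integrand = np.cos(np.outer(np.atleast_1d(u), t)) * (phi_K / phi_eps)
    return np.trapz(integrand, t, axis=1) / (2.0 * np.pi)

def regression_estimate(x_points, W, Y, h, sigma):
    # Nadaraya-Watson-type ratio with deconvoluting weights K_n((x - W_j)/h),
    # where W_j = X_j + error_j are the observed (contaminated) covariates.
    out = []
    for x0 in np.atleast_1d(x_points):
        w = deconv_kernel((x0 - W) / h, h, sigma)
        out.append(np.sum(w * Y) / np.sum(w))
    return np.array(out)

# Toy usage: regression function sin(x), covariate observed with Laplace error.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, 500)
W = X + rng.laplace(scale=0.3, size=X.size)
Y = np.sin(X) + rng.normal(scale=0.2, size=X.size)
print(regression_estimate([0.0, 1.0], W, Y, h=0.5, sigma=0.3))

Because the Laplace characteristic function decays only polynomially, this is the ordinary smooth regime; with, say, Gaussian measurement error the division by phi_eps blows up exponentially in t/h, which is the super smooth regime with its much slower attainable rates.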

Article information

Source
Ann. Statist., Volume 21, Number 4 (1993), 1900-1925.

Dates
First available in Project Euclid: 12 April 2007

Permanent link to this document
https://projecteuclid.org/euclid.aos/1176349402

Digital Object Identifier
doi:10.1214/aos/1176349402

Mathematical Reviews number (MathSciNet)
MR1245773

Zentralblatt MATH identifier
0791.62042

JSTOR
links.jstor.org

Subjects
Primary: 62G20: Asymptotic properties
Secondary: 62G05: Estimation; 62J99: None of the above, but in this section

Keywords
Errors in variables; nonparametric regression; deconvolution; kernel estimator; optimal rates of convergence

Citation

Fan, Jianqing; Truong, Young K. Nonparametric Regression with Errors in Variables. Ann. Statist. 21 (1993), no. 4, 1900--1925. doi:10.1214/aos/1176349402. https://projecteuclid.org/euclid.aos/1176349402
