The Annals of Statistics

Optimal Global Rates of Convergence for Nonparametric Regression

Charles J. Stone

Abstract

Consider a $p$-times differentiable unknown regression function $\theta$ of a $d$-dimensional measurement variable. Let $T(\theta)$ denote a derivative of $\theta$ of order $m$ and set $r = (p - m)/(2p + d)$. Let $\hat{T}_n$ denote an estimator of $T(\theta)$ based on a training sample of size $n$, and let $\| \hat{T}_n - T(\theta)\|_q$ be the usual $L^q$ norm of the restriction of $\hat{T}_n - T(\theta)$ to a fixed compact set. Under appropriate regularity conditions, it is shown that the optimal rate of convergence for $\| \hat{T}_n - T(\theta)\|_q$ is $n^{-r}$ if $0 < q < \infty$, while the optimal rate is $(n^{-1} \log n)^r$ if $q = \infty$.
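To make the rate exponent $r = (p - m)/(2p + d)$ concrete, the following minimal sketch (not part of the paper) evaluates it for a few familiar settings; for instance, estimating the regression function itself ($m = 0$) under twice-differentiability ($p = 2$) in one dimension ($d = 1$) recovers the classical $n^{-2/5}$ rate, and the exponent shrinks as the dimension $d$ grows.

```python
def rate_exponent(p: float, m: int, d: int) -> float:
    """Stone's optimal-rate exponent r = (p - m) / (2p + d) for estimating
    an order-m derivative of a p-smooth regression function of a
    d-dimensional measurement variable."""
    return (p - m) / (2 * p + d)

# Classical case: estimate the function itself (m = 0), p = 2, d = 1.
# The optimal L^q rate (0 < q < infinity) is n^{-2/5}.
print(rate_exponent(2, 0, 1))  # 0.4

# Higher dimension slows the rate (curse of dimensionality):
print(rate_exponent(2, 0, 10))  # ~0.143, i.e. n^{-1/7}

# Estimating a first derivative (m = 1) is harder still:
print(rate_exponent(2, 1, 1))  # 0.2
```

Note that the $q = \infty$ (sup-norm) rate carries the extra logarithmic factor, $(n^{-1} \log n)^r$, with the same exponent $r$.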

Article information

Source
Ann. Statist. Volume 10, Number 4 (1982), 1040-1053.

Dates
First available in Project Euclid: 12 April 2007

Permanent link to this document
http://projecteuclid.org/euclid.aos/1176345969

Digital Object Identifier
doi:10.1214/aos/1176345969

Mathematical Reviews number (MathSciNet)
MR673642

Zentralblatt MATH identifier
0511.62048

Subjects
Primary: 62G20: Asymptotic properties
Secondary: 62G05: Estimation

Citation

Stone, Charles J. Optimal Global Rates of Convergence for Nonparametric Regression. Ann. Statist. 10 (1982), no. 4, 1040--1053. doi:10.1214/aos/1176345969. http://projecteuclid.org/euclid.aos/1176345969.