Abstract
Let $d$ denote a positive integer, $\|x\| = (x^2_1 + \cdots + x^2_d)^{1/2}$ the Euclidean norm of $x = (x_1, \cdots, x_d) \in \mathbb{R}^d$, $k$ a nonnegative integer, $\mathscr{C}_k$ the collection of $k$ times continuously differentiable functions on $\mathbb{R}^d$, and $g_k$ the Taylor polynomial of degree $k$ about the origin corresponding to $g \in \mathscr{C}_k$. Let $M$ and $p > k$ denote positive constants and let $U$ be an open neighborhood of the origin of $\mathbb{R}^d$. Let $\mathscr{G}$ denote the collection of functions $g \in \mathscr{C}_k$ such that $|g(x) - g_k(x)| \leq M \|x\|^p$ for $x \in U$. Let $m \leq k$ be a nonnegative integer, let $\theta_0 \in \mathscr{C}_m$ and set $\Theta = \{\theta_0 + g : g \in \mathscr{G}\}$. Let $L$ be a linear differential operator of order $m$ on $\mathscr{C}_m$ and set $T(\theta) = L\theta(0)$ for $\theta \in \Theta$. Let $(X, Y)$ be a pair of random variables such that $X$ is $\mathbb{R}^d$ valued and $Y$ is real valued. It is assumed that the distribution of $X$ is absolutely continuous and that its density is bounded away from zero and infinity on $U$. The conditional distribution of $Y$ given $X$ is assumed to be (say) normal, with a conditional variance which is bounded away from zero and infinity on $U$. The regression function of $Y$ on $X$ is assumed to belong to $\Theta$. It is shown that $r = (p - m)/(2p + d)$ is the optimal (uniform) rate of convergence for a sequence $\{\hat{T}_n\}$ of estimators of $T(\theta)$ such that $\hat{T}_n$ is based on a random sample of size $n$ from the distribution of $(X, Y)$. An analogous result is obtained for nonparametric estimators of a density function.
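As a concrete illustration of the rate formula, suppose (for example) that $L$ is the identity operator, so that $m = 0$ and $T(\theta) = \theta(0)$ is the value of the regression function at the origin, and take $p = 2$ and $d = 1$. Then
\[
r = \frac{p - m}{2p + d} = \frac{2 - 0}{2 \cdot 2 + 1} = \frac{2}{5},
\]
so the optimal uniform rate of convergence is $n^{-2/5}$, the familiar rate for pointwise estimation of a twice-smooth regression function in one dimension. Estimating the first derivative instead ($m = 1$, with $p$ and $d$ as above) lowers the rate to $n^{-1/5}$.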
Citation
Charles J. Stone. "Optimal Rates of Convergence for Nonparametric Estimators." Ann. Statist. 8(6): 1348–1360, November 1980. https://doi.org/10.1214/aos/1176345206