Abstract
Let $(X, Y)$ be an $\mathbb{R}^d \times \mathbb{R}$-valued random vector and let $(X_1, Y_1), \ldots, (X_n, Y_n)$ be a random sample drawn from its distribution. We study the consistency properties of the kernel estimate $m_n(x)$ of the regression function $m(x) = E\{Y \mid X = x\}$ that is defined by $m_n(x) = \sum_{i=1}^{n} Y_i k((X_i - x)/h_n) \big/ \sum_{i=1}^{n} k((X_i - x)/h_n)$, where $k$ is a bounded nonnegative function on $\mathbb{R}^d$ with compact support and $\{h_n\}$ is a sequence of positive numbers satisfying $h_n \to 0$ and $n h_n^d \to \infty$ as $n \to \infty$. It is shown that $E\{\int |m_n(x) - m(x)|^p \mu(dx)\} \to 0$ as $n \to \infty$ whenever $E\{|Y|^p\} < \infty$ ($p \geqslant 1$). No other restrictions are placed on the distribution of $(X, Y)$. The result is applied to verify the Bayes risk consistency of the corresponding discrimination rules.
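For concreteness, the following is a minimal numerical sketch (not from the paper) of the estimate $m_n(x)$ above, assuming a box kernel $k(u) = 1\{\|u\| \leqslant 1\}$, the common $0/0 = 0$ convention when the denominator vanishes, and a hypothetical bandwidth choice $h_n = n^{-1/5}$ for $d = 1$; the function name and simulated data are illustrative only.

```python
import numpy as np

def kernel_estimate(x, X, Y, h):
    """Kernel regression estimate m_n(x) with a box kernel k(u) = 1{||u|| <= 1}."""
    x = np.asarray(x, dtype=float)
    X = np.asarray(X, dtype=float)   # sample points, shape (n, d)
    Y = np.asarray(Y, dtype=float)   # responses, shape (n,)
    # k((X_i - x)/h) = 1 if ||X_i - x|| <= h, else 0
    w = (np.linalg.norm(X - x, axis=1) <= h).astype(float)
    s = w.sum()
    return float(w @ Y / s) if s > 0 else 0.0   # 0/0 convention: return 0

# Illustrative use: noisy sample from m(x) = sin(2*pi*x) with d = 1
rng = np.random.default_rng(0)
n = 1000
X = rng.uniform(0.0, 1.0, size=(n, 1))
Y = np.sin(2.0 * np.pi * X[:, 0]) + rng.normal(scale=0.3, size=n)
h_n = n ** (-1.0 / 5.0)             # h_n -> 0 and n * h_n^d -> infinity as n grows
print(kernel_estimate([0.25], X, Y, h_n))   # estimate of m(0.25) = sin(pi/2) = 1
```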
Citation
Luc P. Devroye, T. J. Wagner. "Distribution-Free Consistency Results in Nonparametric Discrimination and Regression Function Estimation." Ann. Statist. 8 (2): 231-239, March 1980. https://doi.org/10.1214/aos/1176344949