The Annals of Statistics

Asymptotic Normality of the Recursive Kernel Regression Estimate Under Dependence Conditions

Abstract

For $i = 1,2,\ldots$, let $X_i$ and $Y_i$ be $\mathbb{R}^d$-valued ($d \geq 1$ an integer) and $\mathbb{R}$-valued, respectively, random variables, and let $\{(X_i, Y_i)\}, i \geq 1$, be a strictly stationary and $\alpha$-mixing stochastic process. Set $m(x) = \mathscr{E}(Y_1\mid X_1 = x), x \in \mathbb{R}^d$, and let $\hat{m}_n(x)$ be a certain recursive kernel estimate of $m(x)$. Under suitable regularity conditions and as $n \rightarrow \infty$, it is shown that $\hat{m}_n(x)$, properly normalized, is asymptotically normal with mean 0 and a specified variance. This result is established first under almost sure boundedness of the $Y_i$'s, and then with boundedness replaced by continuity of certain truncated moments. It is also shown that, for distinct points $x_1,\ldots,x_N$ in $\mathbb{R}^d$ ($N \geq 2$ an integer), the joint distribution of the random vector $(\hat{m}_n(x_1),\ldots,\hat{m}_n(x_N))$, properly normalized, is asymptotically $N$-dimensional normal with mean vector 0 and a specified covariance function.
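The abstract refers to "a certain recursive kernel estimate" without stating its form. A minimal sketch, assuming the standard recursive Nadaraya-Watson-type estimator with per-observation bandwidths $h_i \to 0$ (this exact form is an assumption, not quoted from the paper):

```python
import numpy as np

def recursive_kernel_estimate(x, X, Y, bandwidths, kernel=None):
    """Sketch of a recursive Nadaraya-Watson-type estimate of
    m(x) = E[Y | X = x]  (assumed form; the paper's definition may differ):

        m_hat_n(x) = [ sum_{i<=n} Y_i K((x - X_i)/h_i) / h_i^d ]
                     / [ sum_{i<=n} K((x - X_i)/h_i) / h_i^d ]

    The running sums in numerator and denominator make the estimate
    updatable online as new (X_i, Y_i) pairs arrive.
    """
    if kernel is None:
        # Gaussian kernel on R^d (illustrative choice)
        kernel = lambda u: np.exp(-0.5 * np.sum(u * u, axis=-1))
    X = np.asarray(X, dtype=float)
    if X.ndim == 1:                      # treat 1-D input as d = 1
        X = X[:, None]
    d = X.shape[1]
    h = np.asarray(bandwidths, dtype=float)   # shape (n,), h_i -> 0
    w = kernel((x - X) / h[:, None]) / h**d   # per-observation weights
    num = np.cumsum(w * np.asarray(Y, dtype=float))  # running numerator
    den = np.cumsum(w)                               # running denominator
    return num[-1] / den[-1]                         # estimate after n obs
```

For instance, with constant responses $Y_i \equiv c$ the weights cancel and the estimate returns $c$ at any point $x$, which is a quick sanity check on the recursion.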

Article information

Source
Ann. Statist., Volume 20, Number 1 (1992), 98-120.

Dates
First available in Project Euclid: 12 April 2007

Permanent link to this document
https://projecteuclid.org/euclid.aos/1176348514

Digital Object Identifier
doi:10.1214/aos/1176348514

Mathematical Reviews number (MathSciNet)
MR1150336

Zentralblatt MATH identifier
0925.62171
