Open Access
Asymptotic Normality of the Recursive Kernel Regression Estimate Under Dependence Conditions
George G. Roussas, Lanh T. Tran
Ann. Statist. 20(1): 98-120 (March, 1992). DOI: 10.1214/aos/1176348514

Abstract

For $i = 1,2,\ldots$, let $X_i$ and $Y_i$ be $\mathbb{R}^d$-valued ($d \geq 1$ an integer) and $\mathbb{R}$-valued random variables, respectively, and let $\{(X_i, Y_i)\}, i \geq 1$, be a strictly stationary and $\alpha$-mixing stochastic process. Set $m(x) = \mathscr{E}(Y_1\mid X_1 = x), x \in \mathbb{R}^d$, and let $\hat{m}_n(x)$ be a certain recursive kernel estimate of $m(x)$. Under suitable regularity conditions and as $n \rightarrow \infty$, it is shown that $\hat{m}_n(x)$, properly normalized, is asymptotically normal with mean 0 and a specified variance. This result is established first under almost sure boundedness of the $Y_i$'s, and then with boundedness replaced by continuity of certain truncated moments. It is also shown that, for distinct points $x_1,\ldots,x_N$ in $\mathbb{R}^d$ ($N \geq 2$ an integer), the joint distribution of the random vector $(\hat{m}_n(x_1),\ldots,\hat{m}_n(x_N))$, properly normalized, is asymptotically $N$-dimensional normal with mean vector 0 and a specified covariance function.
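For orientation only: the abstract does not reproduce the estimator itself. The recursive kernel regression estimate commonly studied in this setting, and presumably of the form analyzed here, is

$$\hat{m}_n(x) = \frac{\sum_{i=1}^{n} h_i^{-d}\, Y_i\, K\big((x - X_i)/h_i\big)}{\sum_{i=1}^{n} h_i^{-d}\, K\big((x - X_i)/h_i\big)}, \qquad x \in \mathbb{R}^d,$$

where $K$ is a kernel on $\mathbb{R}^d$ and $\{h_i\}$ is a sequence of bandwidths tending to 0. The estimate is called recursive because both the numerator and the denominator are updated by a single additional term when the observation $(X_{n+1}, Y_{n+1})$ arrives, so $\hat{m}_{n+1}(x)$ is computable from the sums defining $\hat{m}_n(x)$ without revisiting earlier data. The precise definition and the regularity conditions should be taken from the paper itself.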

Citation


George G. Roussas, Lanh T. Tran. "Asymptotic Normality of the Recursive Kernel Regression Estimate Under Dependence Conditions." Ann. Statist. 20 (1): 98-120, March, 1992. https://doi.org/10.1214/aos/1176348514

Information

Published: March, 1992
First available in Project Euclid: 12 April 2007

zbMATH: 0925.62171
MathSciNet: MR1150336
Digital Object Identifier: 10.1214/aos/1176348514

Subjects:
Primary: 62G05
Secondary: 62E20, 62J02, 62M09

Keywords: asymptotic joint normality, asymptotic normality, dependence, recursive kernel regression estimate, strong mixing

Rights: Copyright © 1992 Institute of Mathematical Statistics

Vol. 20 • No. 1 • March, 1992