Abstract
The objective in nonparametric regression is to infer a function $m(x)$ on the basis of a finite collection of noisy pairs $\{(X_i, m(X_i) + N_i)\}_{i=1}^{n}$, where the noise components $N_i$ satisfy certain lenient assumptions and the domain points $X_i$ are selected at random. It is known a priori only that $m$ is a member of a nonparametric class of functions (that is, a class of functions like $C\lbrack 0, 1\rbrack$ which, under customary topologies, does not admit a homeomorphic indexing by a subset of a Euclidean space). The main theoretical contribution of this study is to derive uniform convergence bounds and uniform consistency on bounded intervals for the Nadaraya-Watson kernel estimator and its derivatives. We also obtain the corresponding convergence results for the Priestley-Chao estimator in the case that the domain points are nonrandom. With these developments we are able to apply nonparametric regression methodology to the problem of identifying noisy time-varying linear systems.
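For reference, the Nadaraya-Watson estimator discussed above has the standard form shown below; the kernel $K$ and bandwidth sequence $h_n$ are introduced here in the usual textbook notation, which may differ from the paper's own, and the observed responses are written $Y_i = m(X_i) + N_i$.

$$
\hat m_n(x) \;=\; \frac{\sum_{i=1}^{n} K\!\left(\frac{x - X_i}{h_n}\right) Y_i}{\sum_{j=1}^{n} K\!\left(\frac{x - X_j}{h_n}\right)},
\qquad Y_i = m(X_i) + N_i,
$$

where $K$ is a kernel function and $h_n \to 0$ as $n \to \infty$. When $K$ is sufficiently smooth, estimates of the derivatives of $m$ are obtained by differentiating $\hat m_n$ with respect to $x$.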
Citation
E. Schuster, S. Yakowitz. "Contributions to the Theory of Nonparametric Regression, with Application to System Identification." Ann. Statist. 7 (1): 139-149, January 1979. https://doi.org/10.1214/aos/1176344560