Open Access
September, 1988
A Lower Bound on the Error in Nonparametric Regression Type Problems
Yannis G. Yatracos
Ann. Statist. 16(3): 1180-1187 (September, 1988). DOI: 10.1214/aos/1176350954

Abstract

Let $(X_1, Y_1), \ldots, (X_n, Y_n)$ be a sample, let $f(y \mid x_i, \theta(x_i))$ denote the conditional density of $Y_i$ given $X_i = x_i$, and let $\theta$ be an element of a metric space $(\Theta, d)$. A lower bound is provided for the $d$-error in estimating $\theta$. The order of the bound depends on the local behavior of the Kullback information of the conditional density. As an application, we consider the case where $\Theta$ is the space of $q$-smooth functions on $[0, 1]^d$, metrized with the $L_r$ distance, $1 \leq r < \infty$.
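The abstract does not state the resulting rate explicitly, but as an illustrative sketch (not quoted from the paper): for $q$-smooth functions on $[0, 1]^d$, with $d$ here denoting the dimension of the cube rather than the metric, minimax lower bounds in this setting classically take the form

$$\inf_{\hat{\theta}_n} \sup_{\theta \in \Theta} \mathbb{E}\, d(\hat{\theta}_n, \theta) \geq C\, n^{-q/(2q + d)},$$

where the infimum runs over all estimators $\hat{\theta}_n$ based on the sample and $C > 0$ is a constant independent of $n$. This is the optimal rate of convergence for nonparametric regression established by Stone (1982, Ann. Statist.), which lower bounds of this kind are designed to match.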

Citation


Yannis G. Yatracos. "A Lower Bound on the Error in Nonparametric Regression Type Problems." Ann. Statist. 16(3): 1180-1187, September, 1988. https://doi.org/10.1214/aos/1176350954

Information

Published: September, 1988
First available in Project Euclid: 12 April 2007

zbMATH: 0651.62028
MathSciNet: MR959195
Digital Object Identifier: 10.1214/aos/1176350954

Subjects:
Primary: 62G20

Keywords: Kullback information, lower bound of loss in probability, lower bound on minimax risk, nonparametric regression, optimal rates of convergence

Rights: Copyright © 1988 Institute of Mathematical Statistics
