Abstract
Let $(X, Y), (X_1, Y_1), \ldots, (X_n, Y_n)$ be independent, identically distributed random vectors from $R^d \times \{0, 1\}$, and let $\hat{Y}$ be the $k$-nearest neighbor estimate of $Y$ from $X$ and the $(X_i, Y_i)$'s. We show that for all distributions of $(X, Y)$, the limit of $L_n = P(\hat{Y} \neq Y)$ exists and satisfies $$\lim_{n\rightarrow\infty} L_n \leq (1 + a_k)R^\ast, \qquad a_k = \frac{\alpha \sqrt{k}}{k - 3.25}\left(1 + \frac{\beta}{\sqrt{k - 3}}\right), \quad k \text{ odd}, \; k \geq 5,$$ where $R^\ast$ is the Bayes probability of error and $\alpha = 0.3399\cdots$ and $\beta = 0.9749\cdots$ are universal constants. This bound is shown to be best possible in a certain sense.
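To make the bound concrete, the following sketch (not from the paper) evaluates the coefficient $a_k$ and the resulting multiplicative factor $(1 + a_k)$ over the Bayes risk for a few odd values of $k$, using the truncated constants $\alpha \approx 0.3399$ and $\beta \approx 0.9749$ quoted in the abstract; the function name `a_k` and the chosen values of $k$ are illustrative.

```python
from math import sqrt

ALPHA = 0.3399  # universal constant alpha = 0.3399... (truncated)
BETA = 0.9749   # universal constant beta  = 0.9749... (truncated)

def a_k(k: int) -> float:
    """Coefficient a_k = alpha*sqrt(k)/(k - 3.25) * (1 + beta/sqrt(k - 3))."""
    if k < 5 or k % 2 == 0:
        raise ValueError("the bound is stated for odd k >= 5")
    return ALPHA * sqrt(k) / (k - 3.25) * (1.0 + BETA / sqrt(k - 3))

# The theorem gives lim_n P(Yhat != Y) <= (1 + a_k) * R*,
# so (1 + a_k) is the asymptotic multiplicative gap over the Bayes risk;
# a_k shrinks like O(1/sqrt(k)) as k grows.
for k in (5, 7, 9, 21, 101):
    print(f"k = {k:3d}   a_k = {a_k(k):.4f}   factor = {1 + a_k(k):.4f}")
```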
Citation
Luc Devroye. "On the Asymptotic Probability of Error in Nonparametric Discrimination." Ann. Statist. 9(6): 1320–1327, November 1981. https://doi.org/10.1214/aos/1176345648