Abstract
In the present paper we improve the results concerning the rate of convergence of the error probability of second kind of the Neyman-Pearson test when the Kullback-Leibler information $K(P_0,P_1)$ is infinite. It is pointed out that in certain cases the sequence $\exp(-q_{\alpha,n})$ is the correct rate of convergence, where $-q_{\alpha,n}$ denotes the logarithm of the critical value of the Neyman-Pearson test of level $\alpha$ and sample size $n$. Thereby we generalize the classical results of Stein, Chernoff, and Rao, which deal with the error probability of second kind and state that $q_{\alpha,n} \sim n K(P_0,P_1)$ if the Kullback-Leibler information is finite. Moreover, the relation between $q_{\alpha,n}$ and the local behavior of the Laplace transform of the log-likelihood distribution with respect to the hypothesis is studied. The results can be applied to one-sided test problems for exponential families when the hypothesis consists of a single point. In this case it may happen that $q_{\alpha,n}$ is of the order $n^{1/p}$ for some $p$, $0<p<1$.
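For orientation, a minimal sketch of the classical finite-information statement that is being generalized; the notation $\beta_n(\alpha)$ for the minimal error probability of second kind at level $\alpha$ is an assumption of this sketch, not the paper's. If $K(P_0,P_1) = \int \log\frac{dP_0}{dP_1}\,dP_0 < \infty$, Stein's lemma gives
$$ -\frac{1}{n}\,\log \beta_n(\alpha) \;\longrightarrow\; K(P_0,P_1), \qquad n \to \infty, $$
which corresponds to $q_{\alpha,n} \sim n\,K(P_0,P_1)$ above. The paper concerns the growth of $q_{\alpha,n}$ when $K(P_0,P_1) = \infty$, where the rate can exceed the linear order $n$, e.g. $n^{1/p}$ with $0<p<1$.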
Citation
Arnold Janssen. "Asymptotic Properties of Neyman-Pearson Tests for Infinite Kullback-Leibler Information." Ann. Statist. 14(3): 1068-1079, September 1986. https://doi.org/10.1214/aos/1176350050