Abstract
The problem considered is that of testing sequentially between two separated composite hypotheses concerning the mean of a normal distribution with known variance. The parameter space is the real line, on which is assumed an a priori distribution, $W$, with full support. A family $\{\delta(c)\}$ of sequential tests is defined and shown to be asymptotically Bayes, as the cost, $c$, per observation tends to zero, relative to a large class of fully supported a priori distributions. The ratio of the integrated risk of the Bayes procedure to that of $\delta(c)$ is shown to be $1 - O(\log\log c^{-1}/\log c^{-1})$, as $c$ tends to zero, for every $W$.
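The setting above can be illustrated with a minimal simulation. The sketch below is a generic SPRT-style rule for a normal mean with known variance, using the log-likelihood ratio between the two boundary points of the separated hypotheses and an illustrative stopping boundary of $\log(1/c)$ tied to the sampling cost $c$; it is an assumed stand-in for intuition, not the paper's family $\{\delta(c)\}$, and the function and parameter names are hypothetical.

```python
import math
import random

def sequential_test(stream, theta0, theta1, sigma=1.0, c=1e-3, max_n=10_000):
    """SPRT-style sequential test between H0: theta <= theta0 and
    H1: theta >= theta1 (theta0 < theta1) for a normal mean with known
    variance sigma^2.  Illustrative only; the boundary log(1/c) mimics
    how a smaller per-observation cost c permits longer sampling."""
    boundary = math.log(1.0 / c)  # stop once the evidence exceeds log(1/c)
    llr = 0.0                     # log-likelihood ratio of theta1 vs theta0
    n = 0
    for n, x in enumerate(stream, start=1):
        # Increment of the normal log-likelihood ratio at the boundary points.
        llr += ((theta1 - theta0) / sigma**2) * (x - (theta0 + theta1) / 2.0)
        if llr >= boundary:
            return ("accept H1", n)
        if llr <= -boundary:
            return ("accept H0", n)
        if n >= max_n:
            break
    return ("no decision", n)

def observations(theta, sigma=1.0):
    """Infinite stream of N(theta, sigma^2) observations."""
    while True:
        yield random.gauss(theta, sigma)

random.seed(0)
decision, n = sequential_test(observations(1.0), theta0=0.0, theta1=1.0, c=1e-3)
print(decision, n)
```

As $c$ shrinks, the boundary $\log(1/c)$ grows, so the rule samples longer before stopping; the abstract's result concerns how close such cost-indexed families can come to the exactly optimal Bayes procedure in this regime.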
Citation
Gloria C. Zerdy. "Risk of Asymptotically Optimum Sequential Tests." Ann. Statist. 8(5): 1110-1122, September 1980. https://doi.org/10.1214/aos/1176345148