The Annals of Mathematical Statistics

Asymptotically Optimal Bayes and Minimax Procedures in Sequential Estimation

Peter J. Bickel and Joseph A. Yahav



In [4] we introduced a general method for obtaining asymptotically pointwise optimal procedures in sequential analysis when the cost of observation is constant. The validity of this method in both estimation and testing was established in [4] for Koopman-Darmois families, and in [5] for the general case. Section 2 of this paper generalizes Theorem 2.1 of [4] to cover, essentially, the case of estimation with variable cost of observation. In Section 3 we show that in estimation problems with constant cost of observation, under a very weak condition, the asymptotically pointwise optimal rules we propose are optimal in the sense of Kiefer and Sacks [9]. This condition is further investigated in the context of Bayesian sequential estimation in Section 4 and is shown to be satisfied whenever reasonable estimates based on the method of moments exist. In Section 5 we consider the robustness of our rules under a change of prior. The main result of this section is given by Theorem 5.1. Finally, Theorem 5.2 deals with a generalization of Wald's [12] theory of asymptotically minimax rules and an application of that theory to the Bayesian model.
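To illustrate the constant-cost setting, here is a stylized sketch, not the authors' exact procedure: estimating a normal mean under squared-error loss with a flat prior, where the posterior risk after $n$ observations is roughly $\sigma^2/n$, so the total risk $\sigma^2/n + nc$ is minimized near $n^* = \sqrt{\sigma^2/c}$. A rule in the spirit of asymptotically pointwise optimal procedures stops the first time the posterior risk falls below the accumulated marginal cost $nc$; all numerical values below are illustrative assumptions.

```python
# Stylized sketch (NOT the authors' exact rule): sequential estimation of a
# normal mean, squared-error loss, constant cost c per observation.
# Flat prior => posterior risk after n observations is sigma^2 / n, so the
# total risk sigma^2/n + n*c is minimized near n* = sqrt(sigma^2 / c).
# Stop the first time the posterior risk drops below n*c.

import random

def sequential_estimate(sigma=2.0, c=1e-4, mu=0.0, seed=1, max_n=100_000):
    rng = random.Random(seed)
    total, n = 0.0, 0
    while n < max_n:
        total += rng.gauss(mu, sigma)          # draw one more observation
        n += 1
        posterior_risk = sigma**2 / n          # posterior variance, flat prior
        if posterior_risk <= n * c:            # risk no longer worth the cost
            break
    return n, total / n                        # stopping time and estimate

n_stop, est = sequential_estimate()
print("stopped at n =", n_stop)                # here n* = sqrt(4 / 1e-4) = 200
print("estimate of mu =", round(est, 3))
```

With these assumed values the stopping condition $\sigma^2/n \le nc$ first holds at $n = 200 = \sqrt{\sigma^2/c}$, matching the deterministic optimum; with a genuine prior the posterior risk would depend on the data and the stopping time would be random.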

Article information

Ann. Math. Statist., Volume 39, Number 2 (1968), 442-456.

First available in Project Euclid: 27 April 2007

Bickel, Peter J.; Yahav, Joseph A. Asymptotically Optimal Bayes and Minimax Procedures in Sequential Estimation. Ann. Math. Statist. 39 (1968), no. 2, 442--456. doi:10.1214/aoms/1177698408.
