Abstract
Let $X_1, X_2, \cdots$ be independent observations from a population with mean $\mu$ and variance $\sigma^2$, and suppose that given a sample of size $n$ one wishes to estimate $\mu$ by $\bar{X}_n$, subject to the loss function $L_n = A\sigma^{2\beta-2}(\bar{X}_n - \mu)^2 + n$, where $A > 0$ and $\beta > 0$. If $\sigma$ is known, the optimal fixed sample size $n_0$ minimizing the risk $R_n = EL_n$ can be computed, but if $\sigma$ is unknown there is no fixed sample size procedure that achieves the minimum risk. For the case when $\sigma$ is unknown, a number of authors have investigated the performance of sequential estimation procedures designed to come close to attaining the minimum risk $R_{n_0}$. In this paper it is shown that for the class of sequential estimation procedures with stopping rules $T_A = \inf\{n \geq n_A: n^{-1} \sum^n_{i=1} (X_i - \bar{X}_n)^2 \leq A^{-1/\beta}n^{2/\beta}\}$, the regret $R_{T_A} - R_{n_0}$ remains bounded as $A\rightarrow \infty$ under suitable assumptions on the moments of $X_1$ and on the delay $n_A$, but, unlike previous bounded-regret results, without any assumption about the type of distribution of $X_1$.
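For orientation, when $\sigma$ is known the risk is $R_n = EL_n = A\sigma^{2\beta}/n + n$, which (treating $n$ as continuous) is minimized at $n_0 = A^{1/2}\sigma^\beta$, giving $R_{n_0} = 2A^{1/2}\sigma^\beta = 2n_0$; the stopping rule $T_A$ is the natural plug-in version, stopping as soon as $n \geq A^{1/2}\hat{\sigma}_n^\beta$ with $\hat{\sigma}_n^2 = n^{-1}\sum_{i=1}^n (X_i - \bar{X}_n)^2$. The sketch below is not part of the paper: it is a minimal Monte Carlo illustration of the stopping rule and of the regret $R_{T_A} - R_{n_0}$, assuming normally distributed data, a small fixed delay in place of $n_A$, and hypothetical function names (`stopping_time`, `monte_carlo_regret`) chosen here for illustration.

```python
import numpy as np

def stopping_time(A, beta, rng, mu=0.0, sigma=1.0, n_delay=2, n_max=10**7):
    """Sample X_1, X_2, ... ~ N(mu, sigma^2) until
    T_A = inf{n >= n_delay : (1/n) sum_{i<=n} (X_i - Xbar_n)^2 <= A^(-1/beta) n^(2/beta)}.
    Returns (T_A, Xbar_{T_A}). The normal data and fixed delay are illustrative choices."""
    x = list(rng.normal(mu, sigma, n_delay))
    n = n_delay
    while n < n_max:
        v = np.var(x)  # (1/n) * sum (X_i - Xbar_n)^2, i.e. ddof = 0
        if v <= A ** (-1.0 / beta) * n ** (2.0 / beta):
            return n, float(np.mean(x))
        x.append(rng.normal(mu, sigma))
        n += 1
    return n, float(np.mean(x))

def monte_carlo_regret(A, beta=1.0, mu=0.0, sigma=1.0, reps=2000, seed=0):
    """Estimate the regret R_{T_A} - R_{n_0} by simulation, using
    n_0 = A^(1/2) sigma^beta and R_{n_0} = 2 n_0 for the fixed-sample benchmark."""
    rng = np.random.default_rng(seed)
    n0 = np.sqrt(A) * sigma ** beta
    R_n0 = 2.0 * n0
    losses = []
    for _ in range(reps):
        T, xbar = stopping_time(A, beta, rng, mu=mu, sigma=sigma)
        losses.append(A * sigma ** (2 * beta - 2) * (xbar - mu) ** 2 + T)
    return float(np.mean(losses)) - R_n0

if __name__ == "__main__":
    # Boundedness of the regret suggests these estimates should not grow with A.
    for A in (100.0, 1000.0, 10000.0):
        print(A, monte_carlo_regret(A))
```

In this sketch the data are Gaussian purely for convenience; the point of the paper is that the bounded-regret conclusion requires only moment conditions on $X_1$ and a suitable delay $n_A$, not a distributional assumption.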
Citation
Y. S. Chow and A. T. Martinsek. "Bounded Regret of a Sequential Procedure for Estimation of the Mean." Ann. Statist. 10(3): 909-914, September 1982. https://doi.org/10.1214/aos/1176345880