## Abstract

Consider a random sample of $N$ observations $x_1, x_2, \cdots, x_N,$ from a universe of mean $\mu$ and variance $\sigma^2$. Let $m$ and $s^2$ be the sample mean and variance respectively: \begin{equation*}\tag{1} m = \frac{1}{N} \sum^N_{i=1} x_i,\quad s^2 = \frac{1}{N} \sum^N_{i=1} (x_i - m)^2.\end{equation*} It is shown that the following conservative confidence interval holds for $\mu$: \begin{equation*}\tag{2} \operatorname{Prob}\{ (m - \mu)^2 \leqq s^2/(N - 1) + \lambda\sigma^2\sqrt{2/N(N - 1)}\} > 1 - \lambda^{-2},\end{equation*} where $\lambda$ is any positive constant. Inequality (2) also holds if, in the braces, $\lambda$ is replaced by $\sqrt{\lambda^2 - 1}$, with $\lambda \geqq 1$. Inequality (2) is much more efficient on the average than Tchebychef's inequality for the mean, namely, \begin{equation*}\tag{3} \operatorname{Prob} \{(m - \mu)^2 \leqq \lambda^2\sigma^2/N\} > 1 - \lambda^{-2},\end{equation*} yet (2) and (3) are both distribution-free, requiring only knowledge about $\sigma^2$. At the $1 - \lambda^{-2} = .99$ level of confidence, the expected value of the right member in the braces of (2) is only about $1/6$ the corresponding member of (3); at the .999 level of confidence the ratio is about $1/20$. A more general inequality than (2) is developed, also involving only the single parameter $\sigma^2$.
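The comparison between (2) and (3) can be checked empirically. The following sketch is not from the paper; it is a minimal Monte Carlo illustration, assuming a normal universe and the illustrative choices $N = 25$ and $\lambda = 10$ (so that $1 - \lambda^{-2} = .99$). It verifies that both inequalities cover $\mu$ at least as often as guaranteed, and estimates the ratio of the expected right members in the braces of (2) and (3).

```python
import numpy as np

# Monte Carlo check of inequality (2) versus Tchebychef's inequality (3).
# The universe (normal), N, lam, and trial count are illustrative choices,
# not taken from the paper.
rng = np.random.default_rng(0)

N, lam, trials = 25, 10.0, 100_000   # 1 - lam**-2 = 0.99 confidence level
mu, sigma2 = 0.0, 1.0

x = rng.normal(mu, np.sqrt(sigma2), size=(trials, N))
m = x.mean(axis=1)
s2 = x.var(axis=1)                   # divisor N, matching equation (1)

# Right members in the braces of (2) and (3).
bound2 = s2 / (N - 1) + lam * sigma2 * np.sqrt(2.0 / (N * (N - 1)))
bound3 = lam**2 * sigma2 / N

cover2 = np.mean((m - mu) ** 2 <= bound2)
cover3 = np.mean((m - mu) ** 2 <= bound3)
ratio = bound2.mean() / bound3       # expected size of (2) relative to (3)

print(f"coverage of (2): {cover2:.4f}")
print(f"coverage of (3): {cover3:.4f}")
print(f"mean bound ratio (2)/(3): {ratio:.3f}")
```

For this normal universe the observed ratio comes out close to the $1/6$ figure stated in the abstract; both coverages comfortably exceed the guaranteed $.99$, as expected from conservative, distribution-free bounds.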

## Citation

Louis Guttman. "A Distribution-Free Confidence Interval for the Mean." Ann. Math. Statist. 19 (3): 410–413, September 1948. https://doi.org/10.1214/aoms/1177730207
