Open Access
June, 1965
Bernard Friedman's Urn
David A. Freedman
Ann. Math. Statist. 36(3): 956-970 (June, 1965). DOI: 10.1214/aoms/1177700068

Abstract

In Friedman's urn scheme, an urn starts with $W_0$ white balls and $B_0$ black balls; at each stage a ball is drawn at random and replaced, together with $\alpha$ balls of the same color and $\beta$ balls of the opposite color, so that $W_n$ and $B_n$ are the numbers of white and black balls after $n$ draws. The case $\beta = 0$ is the famous Polya (1931) Urn; a discussion of its elementary properties can be found in (Feller, 1960, Chapter IV) and (Frechet, 1943). The following facts about the Polya Urn are a classical part of the oral tradition, although some have yet to appear in print (see Blackwell and Kendall, 1964). The fractions $(W_n + B_n)^{-1}W_n$ converge with probability 1 to a limiting random variable $Z$, which has a beta distribution with parameters $W_0/\alpha, B_0/\alpha$. Given $Z$, the successive differences $W_{n + 1} - W_n :n \geqq 0$ are conditionally independent and identically distributed, being $\alpha$ with probability $Z$ and 0 with probability $1 - Z$. Proofs are in Section 2.

If $\beta > 0$, the situation is radically different. No matter how large $\alpha$ is in comparison with $\beta$, the fractions $(W_n + B_n)^{-1}W_n$ converge to $\frac{1}{2}$ with probability 1. This seemingly paradoxical result can be sharpened in several ways. Abbreviate $\rho$ for $(\alpha + \beta)^{-1}(\alpha - \beta)$. If $\rho > \frac{1}{2}$, it is proved in Section 3 that $(W_n + B_n)^{-\rho}(W_n - B_n)$ converges with probability 1 to a nondegenerate limiting random variable. This result in turn fails for $\rho \leqq \frac{1}{2}$. If $0 < \rho \leqq \frac{1}{2}$, the sequence $(W_n + B_n)^{-\rho}(W_n - B_n)$ has plus infinity for superior limit and minus infinity for inferior limit, with probability 1. If $\rho < 0$, the sequence $(W_n - B_n)$ has plus infinity for superior limit and minus infinity for inferior limit, with probability 1. In both cases, the tail $\sigma$-field of $(W_n, B_n) :n \geqq 0$ is trivial.

If $\rho < \frac{1}{2}$, it is proved in Section 5 that the distribution of $n^{-\frac{1}{2}}(W_n - B_n)$ converges to normal with mean 0 and variance $(\alpha - \beta)^2/(1 - 2\rho)$. When $\rho = \frac{1}{2}$, the last fraction is not defined; but the distribution of $(n \log n)^{-\frac{1}{2}}(W_n - B_n)$ converges to normal with mean 0 and variance $(\alpha - \beta)^2$. The asymptotic normality of $W_n - B_n$ for $\rho \leqq \frac{1}{2}$ was observed by Bernstein (1940). I am grateful to J. A. McFadden for calling this paper to my attention.

Consider taking $\alpha = 0$ and $\beta = 1$, so $\rho < 0$. Since $(W_n + B_n)^{-1}W_n$ converges to $\frac{1}{2}$, $W_n - B_n$ is asymptotically like the sum of $n$ independent random variables, each equal to $+1$ with probability $\frac{1}{2}$ and $-1$ with probability $\frac{1}{2}$. It is tempting to conclude that the distribution of $n^{-\frac{1}{2}}(W_n - B_n)$ converges to normal with mean 0 and variance 1. From the preceding paragraph, however, the asymptotic variance is $\frac{1}{3}$.
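The contrast between the naive variance 1 and the true asymptotic variance $\frac{1}{3}$ is easy to check numerically. The following Python sketch is not part of the original paper; the function name, the initial composition $W_0 = B_0 = 1$, and the sample sizes are illustrative assumptions. It simulates Friedman's urn with $\alpha = 0$, $\beta = 1$ and estimates the variance of $n^{-\frac{1}{2}}(W_n - B_n)$.

import random

def friedman_urn(n, alpha, beta, w0=1, b0=1):
    """Simulate n draws from Friedman's urn and return W_n - B_n."""
    w, b = w0, b0
    for _ in range(n):
        if random.random() < w / (w + b):  # a white ball is drawn
            w += alpha
            b += beta
        else:                              # a black ball is drawn
            w += beta
            b += alpha
    return w - b

# alpha = 0, beta = 1 gives rho = -1, so the predicted asymptotic variance of
# n^(-1/2) (W_n - B_n) is (alpha - beta)^2 / (1 - 2 rho) = 1/3, not 1.
n, reps = 10_000, 1_000
samples = [friedman_urn(n, alpha=0, beta=1) / n ** 0.5 for _ in range(reps)]
mean = sum(samples) / reps
variance = sum((x - mean) ** 2 for x in samples) / reps
print(f"estimated variance of n^(-1/2)(W_n - B_n): {variance:.3f} (theory: 1/3)")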
There is an even more startling difference between the asymptotic behavior of $(W_n - B_n) : n \geqq 0$ and that of a coin-tossing game. Let $X_n :n \geqq 1$ be independent and $\pm 1$ with probability $\frac{1}{2}$ each. Let $S_n = X_1 + \cdots + X_n$ and $S_{j/n,n} = n^{-\frac{1}{2}}S_j$. Define $S_{t,n}$ for $0 \leqq t \leqq 1$ and $nt$ not integral by linear interpolation. By the celebrated Invariance Principle of Donsker (1951), the law of $\{S_{t,n} :0 \leqq t \leqq 1\}$ converges in a strong way to the law of a Brownian motion. However, for $\rho < \frac{1}{2}$ suppose we define $Z_{j/n,n} = n^{-\frac{1}{2}}(W_j - B_j)$ and $\{Z_{t,n} :0 \leqq t \leqq 1\}$ by linear interpolation. The law of $\{Z_{t,n} :0 \leqq t \leqq 1\}$ converges in the sense of the Invariance Principle to the law of a process $\{Z_t :0 \leqq t \leqq 1\}$. Now $Z_t$ is normal with mean 0 and variance $(1 - 2\rho)^{-1}(\alpha - \beta)^2 t$. But $\{Z_t :0 \leqq t \leqq 1\}$, though Gaussian, does not have independent increments. On the other hand, $\{t^{-\rho}Z_t :0 \leqq t \leqq 1\}$ is a nonhomogeneous Brownian motion. If $\rho = \frac{1}{2}$, it is necessary to put $Z_{j/n,n} = (n \log n)^{-\frac{1}{2}}(W_j - B_j)$. In the limit, $Z_t = t^{\frac{1}{2}}Z_1$, where $Z_1$ is normal with mean 0 and variance $(\alpha - \beta)^2$. These results were obtained independently by K. Ito and myself. Details will be given in a future joint paper.

D. Ornstein has obtained the following very intuitive proof that $(W_n + B_n)^{-1}W_n$ converges to $\frac{1}{2}$ with probability 1 for $\beta > 0$. Suppose first $\alpha > \beta$. If $0 \leqq x \leqq 1$ and $P\lbrack\lim \sup (W_n + B_n)^{-1}W_n \leqq x\rbrack = 1$, then by an easy variation of the Strong Law, with probability 1, in $N$ trials there will be at most $Nx + o(N)$ drawings of a white ball, and so at least $N(1 - x) - o(N)$ drawings of a black ball. Therefore, with probability 1, $\lim \sup (W_n + B_n)^{-1}W_n$ is bounded above by $\lim_{N\rightarrow\infty}\{\alpha\lbrack Nx + o(N)\rbrack + \beta\lbrack N(1 - x) - o(N)\rbrack\}/N(\alpha + \beta)$, that is, by $(\alpha + \beta)^{-1}\lbrack\beta + (\alpha - \beta)x\rbrack$. Starting with $x = 1$ and iterating, $P\lbrack\lim \sup (W_n + B_n)^{-1}W_n \leqq \frac{1}{2}\rbrack = 1$ follows. Interchange white and black to complete the proof for $\alpha > \beta$. If $\alpha < \beta$, and $P\lbrack\lim \sup (W_n + B_n)^{-1}W_n \leqq x\rbrack = 1$, then a similar argument shows $P\lbrack\lim \sup (W_n + B_n)^{-1}B_n \leqq (\alpha + \beta)^{-1}\lbrack\alpha + (\beta - \alpha)x\rbrack\rbrack = 1$. The argument proceeds as before, except both colors must be considered simultaneously.
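The iteration in Ornstein's argument can be made concrete: the map $x \mapsto (\alpha + \beta)^{-1}\lbrack\beta + (\alpha - \beta)x\rbrack$ has slope $(\alpha - \beta)/(\alpha + \beta)$, which is less than 1 in absolute value whenever $\beta > 0$, so iterating from $x = 1$ contracts to the unique fixed point $\frac{1}{2}$. The short Python sketch below (illustrative only; the function name and the parameter pairs are arbitrary choices, not from the paper) carries out this iteration.

def ornstein_bound(alpha, beta, steps=1_000, x=1.0):
    """Iterate the upper-bound map x -> (beta + (alpha - beta) * x) / (alpha + beta)."""
    for _ in range(steps):
        x = (beta + (alpha - beta) * x) / (alpha + beta)
    return x

# Even when alpha is much larger than beta, the iterated bound approaches 1/2.
for alpha, beta in [(100, 1), (5, 2), (3, 1)]:
    print(f"alpha={alpha}, beta={beta}: iterated bound = {ornstein_bound(alpha, beta):.6f}")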

Citation


David A. Freedman. "Bernard Friedman's Urn." Ann. Math. Statist. 36 (3) 956 - 970, June, 1965. https://doi.org/10.1214/aoms/1177700068

Information

Published: June, 1965
First available in Project Euclid: 27 April 2007

zbMATH: 0138.12003
MathSciNet: MR177432
Digital Object Identifier: 10.1214/aoms/1177700068

Rights: Copyright © 1965 Institute of Mathematical Statistics
