Abstract
In the processes under consideration, an interval of length $L$ splits with probability (or exponential rate) proportional to $L^\alpha$, $\alpha \in [-\infty, \infty]$, and when it splits, it splits into two intervals of length $LV$ and $L(1 - V)$, where $V$ has d.f. $F$ on $(0, 1)$. When $\alpha = 1$ and $F(x) = x$, the split points are i.i.d. uniform on $(0, 1)$, and when $\alpha = \infty$ (a longest interval is always split), the model is a splitting process invented by Kakutani. In both these cases, the empirical distribution of the split points converges almost surely to the uniform distribution on $(0, 1)$. On the other hand, when $\alpha = 0$, the model is a binary cascade and the empirical distribution of the split points converges almost surely to a random, continuous, singular distribution. In this paper, we show what happens in the other cases. Can the reader guess at what point the character of the limiting behavior changes?
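To make the dynamics concrete, here is a minimal simulation sketch of the discrete-time version of the process described above; it is not code from the paper, the function name `simulate_splits` is hypothetical, and $F$ is taken to be the uniform distribution, $F(x) = x$. At each step an interval is chosen with probability proportional to $L^\alpha$ (with $\alpha = \infty$ handled by always splitting a longest interval) and split at a uniform point.

```python
import bisect
import random

def simulate_splits(alpha, n_splits, seed=0):
    """Simulate the interval-splitting process on (0, 1).

    At each step, an interval of length L is chosen with probability
    proportional to L**alpha (alpha = float('inf') always splits a
    longest interval), and it is split at a uniform point V, i.e.,
    F(x) = x. Returns the list of split points in order of creation.
    """
    rng = random.Random(seed)
    endpoints = [0.0, 1.0]  # sorted endpoints of the current intervals
    points = []
    for _ in range(n_splits):
        lengths = [b - a for a, b in zip(endpoints, endpoints[1:])]
        if alpha == float("inf"):
            # Kakutani's process: always split a longest interval.
            i = max(range(len(lengths)), key=lengths.__getitem__)
        else:
            weights = [L ** alpha for L in lengths]
            i = rng.choices(range(len(lengths)), weights=weights)[0]
        # Split the chosen interval at a uniform point V in (0, 1).
        v = rng.random()
        split = endpoints[i] + v * lengths[i]
        bisect.insort(endpoints, split)
        points.append(split)
    return points

# alpha = 1 reproduces i.i.d. uniform split points; alpha = 0 is the
# binary cascade, whose empirical distribution is random and singular.
pts = simulate_splits(alpha=0.0, n_splits=5000)
```

Plotting the empirical distribution of `pts` for various $\alpha$ gives a quick heuristic for the question posed at the end of the abstract.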
Citation
Michael D. Brennan and Richard Durrett. "Splitting Intervals." Ann. Probab. 14(3): 1024–1036, July 1986. https://doi.org/10.1214/aop/1176992456