Consider a stationary $m$-dependent sequence of random indicator variables. If $m > 1$, assume further that any two nonzero values are separated by at least $m - 1$ zeros. This paper studies the lengths of the successive intervals between the nonzero values of the original sequence and shows that, provided a technical condition holds, these lengths converge in distribution, with their moments converging exponentially fast, in all cases but one.
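The setup can be illustrated with a small simulation. The sketch below is not from the paper: the block-factor construction, the threshold parameter `p`, and the function names are illustrative assumptions. It builds a stationary 1-dependent indicator sequence (each $X_i$ is a function of two consecutive i.i.d. uniforms, so variables more than one index apart are independent) and collects the lengths of the successive intervals between its nonzero values.

```python
import random


def one_dependent_indicators(n, p=0.7, seed=0):
    """Stationary 1-dependent 0/1 sequence via a block factor (illustrative):
    X_i = 1 iff U_i > p and U_{i+1} > p for i.i.d. uniform U's, so X_i and
    X_j are independent whenever |i - j| > 1."""
    rng = random.Random(seed)
    u = [rng.random() for _ in range(n + 1)]
    return [1 if u[i] > p and u[i + 1] > p else 0 for i in range(n)]


def interval_lengths(xs):
    """Lengths of the successive intervals between nonzero values,
    measured as the index gaps between consecutive ones."""
    positions = [i for i, x in enumerate(xs) if x == 1]
    return [b - a for a, b in zip(positions, positions[1:])]


xs = one_dependent_indicators(100_000)
gaps = interval_lengths(xs)
```

The empirical distribution of `gaps` is what the paper's limit theorem concerns: under its technical condition, these interval lengths converge in distribution as one moves along the sequence.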
"Runs in $m$-Dependent Sequences." Ann. Probab. 12 (3): 805–818, August 1984. https://doi.org/10.1214/aop/1176993229