The first time at which the $N$ sites to the right of the origin become simultaneously empty in a one-dimensional zero-range process, divided by its mean, is shown to converge exponentially fast to the exponential distribution as $N \rightarrow \infty$. The initial distribution of the process is assumed to be one of the extremal invariant measures $\nu_\rho, \rho \in (0, 1)$, which has density $\rho/(1 - \rho)$. The proof is based on the classical Burke theorem.
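As a quick numerical check of the density formula, the sketch below assumes the constant-rate case $g(k) = \mathbf{1}\{k \geq 1\}$, for which the extremal invariant measures $\nu_\rho$ are product measures with geometric marginals on $\{0, 1, 2, \ldots\}$; under that assumption the mean occupation per site is $\rho/(1-\rho)$, matching the density quoted in the abstract. This is an illustrative simulation, not part of the paper's argument.

```python
import numpy as np

rng = np.random.default_rng(0)
rho = 0.6  # parameter of the invariant measure nu_rho, rho in (0, 1)

# Assumption: constant-rate zero-range process, g(k) = 1 for k >= 1.
# Then the marginal of nu_rho at each site is geometric on {0, 1, 2, ...},
# P(eta(x) = k) = (1 - rho) * rho**k, with mean rho / (1 - rho).
# numpy's geometric sampler is supported on {1, 2, ...}, so shift by 1.
samples = rng.geometric(p=1 - rho, size=200_000) - 1

empirical_density = samples.mean()
predicted_density = rho / (1 - rho)
print(empirical_density, predicted_density)
```

With $\rho = 0.6$ the predicted density is $0.6/0.4 = 1.5$, and the empirical mean should agree to within Monte Carlo error.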
"Exponential Waiting Time for a Big Gap in a One-Dimensional Zero-Range Process." Ann. Probab. 22(1): 284-288, January 1994. https://doi.org/10.1214/aop/1176988860