Open Access
November 1997 Patterns of buffer overflow in a class of queues with long memory in the input stream
David Heath, Sidney Resnick, Gennady Samorodnitsky
Ann. Appl. Probab. 7(4): 1021-1057 (November 1997). DOI: 10.1214/aoap/1043862423


We study the time it takes until a fluid queue with a finite, but large, holding capacity reaches the overflow point. The queue is fed by an on/off process with a heavy-tailed on distribution, which is known to have long memory. It turns out that the expected time until overflow, as a function of capacity L, increases only polynomially fast; so overflows happen much more often than in the "classical" light-tailed case, where the expected overflow time increases as an exponential function of L. Moreover, we show that in the heavy-tailed case overflows are basically caused by single huge jobs. An implication is that the usual $GI/G/1$ queue with finite but large holding capacity and heavy-tailed service times will overflow about equally often no matter how much we increase the service rate. We also study the time until overflow for queues fed by a superposition of k i.i.d. on/off processes with a heavy-tailed on distribution, and we show the benefit of pooling the system resources as far as time until overflow is concerned.
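The mechanism described above can be illustrated with a small Monte Carlo sketch. The code below simulates a fluid queue fed by a single on/off source whose on-periods are Pareto (heavy-tailed) and whose off-periods are exponential, and records the first time the buffer content reaches the capacity L. All parameter values (fill rate, drain rate, tail index alpha) are illustrative assumptions for the sketch, not figures from the paper; the paper's asymptotic results are proved analytically, not by simulation.

```python
import random

def time_to_overflow(L, rate_in=2.0, rate_out=1.0, alpha=1.5, seed=None):
    """Simulated time until a fluid buffer first reaches capacity L.

    The buffer fills at net rate (rate_in - rate_out) during heavy-tailed
    Pareto(alpha) on-periods and drains at rate_out during exponential(1)
    off-periods.  For alpha < 2 the on-periods have infinite variance,
    which is the long-memory regime studied in the paper.  Requires
    rate_in > rate_out so that a single long on-period can cause overflow.
    """
    rng = random.Random(seed)
    t, buf = 0.0, 0.0
    net = rate_in - rate_out            # net fill rate while the source is on
    while True:
        # On-period: Pareto with tail P(X > x) = x^{-alpha}, x >= 1,
        # sampled by inversion; 1 - rng.random() lies in (0, 1].
        on = (1.0 - rng.random()) ** (-1.0 / alpha)
        if buf + net * on >= L:
            # Overflow occurs part-way through this single on-period.
            return t + (L - buf) / net
        buf += net * on
        t += on
        # Off-period: the buffer drains at rate_out, but not below zero.
        off = rng.expovariate(1.0)
        buf = max(0.0, buf - rate_out * off)
        t += off
```

Averaging `time_to_overflow(L, seed=k)` over many runs `k` for a range of capacities gives an empirical estimate of the expected overflow time as a function of L; under the paper's heavy-tailed assumptions one expects polynomial rather than exponential growth, driven by a single exceptionally long on-period.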




Published: November 1997
First available in Project Euclid: 29 January 2003

zbMATH: 0905.60070
MathSciNet: MR1484796
Digital Object Identifier: 10.1214/aoap/1043862423

Primary: 60K25, 90B15

Keywords: $G/G/1$ queue, buffer overflow, fluid models, heavy tailed distribution, heavy tails, long memory, long range dependence, maximum work load, on/off models, regular variation, time to hit a level, weak convergence

Rights: Copyright © 1997 Institute of Mathematical Statistics

