## The Annals of Probability

### A Liapounov bound for solutions of the Poisson equation

#### Abstract

In this paper we consider $\psi$-irreducible Markov processes evolving in discrete or continuous time on a general state space. We develop a Liapounov function criterion that permits one to obtain explicit bounds on the solution to the Poisson equation and, in particular, to obtain conditions under which the solution is square integrable.
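For orientation, in the standard discrete-time setting the objects named above take the following form; the notation ($P$, $\pi$, $V$, $b$, $C$) is the usual one from the Markov chain literature and is not taken verbatim from this paper.

```latex
% Poisson's equation for a transition kernel P with invariant probability \pi:
% given f with \pi(|f|) < \infty, find \hat f solving
(I - P)\hat f = f - \pi(f),
\qquad \text{i.e.} \qquad
P\hat f(x) - \hat f(x) = -\bigl(f(x) - \pi(f)\bigr).
% A Liapounov (Foster--Lyapunov) drift criterion of the flavor used in this
% literature: for a function V \ge 0, a constant b < \infty, and a set C,
PV(x) \;\le\; V(x) - f(x) + b\,\mathbf{1}_C(x),
% under which one seeks an explicit bound of the form
% |\hat f(x)| \le c\,(V(x) + 1) for some constant c.
```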

These results are applied to obtain sufficient conditions that guarantee the validity of a functional central limit theorem for the Markov process. As a second consequence of the bounds obtained, a perturbation theory for Markov processes is developed, giving conditions under which both the solution to the Poisson equation and the invariant probability of the process are continuous functions of its transition kernel. The techniques are illustrated with applications to queueing theory and autoregressive processes.
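The link between Poisson's equation and the central limit theorem rests on a standard martingale decomposition, sketched below in the discrete-time notation introduced earlier; this is the classical argument, not the paper's exact statement.

```latex
% Writing \bar f = f - \pi(f) = (I - P)\hat f, partial sums telescope as
\sum_{k=0}^{n-1} \bar f(X_k)
  \;=\; \hat f(X_0) - \hat f(X_n)
        \;+\; \sum_{k=1}^{n} \bigl(\hat f(X_k) - P\hat f(X_{k-1})\bigr),
% where the final sum is a martingale. The functional CLT then holds with
% asymptotic variance
\gamma^2 \;=\; \pi\bigl(\hat f^{\,2} - (P\hat f)^2\bigr),
% which is finite whenever \hat f is square integrable, as in the abstract.
```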

#### Article information

**Source:** Ann. Probab., Volume 24, Number 2 (1996), 916-931.

**Dates:** First available in Project Euclid: 11 December 2002

**Permanent link:** https://projecteuclid.org/euclid.aop/1039639370

**Digital Object Identifier:** doi:10.1214/aop/1039639370

**Mathematical Reviews number (MathSciNet):** MR1404536

**Zentralblatt MATH identifier:** 0863.60063

#### Citation

Glynn, Peter W.; Meyn, Sean P. A Liapounov bound for solutions of the Poisson equation. Ann. Probab. 24 (1996), no. 2, 916--931. doi:10.1214/aop/1039639370. https://projecteuclid.org/euclid.aop/1039639370

#### References

• [1] Aldous, D. J. and Thorisson, H. (1993). Shift-coupling. Stochastic Process. Appl. 44 1-14.
• [2] Asmussen, S. (1987). Applied Probability and Queues. Wiley, New York.
• [3] Athreya, K. B. and Pantula, S. G. (1986). Mixing properties of Harris chains and autoregressive processes. J. Appl. Probab. 23 880-892.
• [4] Benveniste, A., Metivier, M. and Priouret, P. (1990). Adaptive Algorithms and Stochastic Approximations. Applications of Mathematics 22. Springer, Berlin. (Translated from the French by Stephen S. Wilson.)
• [5] Bhattacharya, R. N. (1982). On the functional central limit theorem and the law of the iterated logarithm for Markov processes. Z. Wahrsch. Verw. Gebiete 60 185-201.
• [6] Caines, P. (1988). Linear Stochastic Systems. Wiley, New York.
• [7] Constantinescu, C. and Cornea, A. (1972). Potential Theory on Harmonic Spaces. Springer, Berlin.
• [8] Davis, M. H. A. (1993). Markov Models and Optimization. Chapman and Hall, London.
• [9] Down, D., Meyn, S. P. and Tweedie, R. L. (1993). Geometric and uniform ergodicity of Markov processes. Ann. Probab. To appear.
• [10] Duflo, M. (1990). Méthodes Récursives Aléatoires. Masson, Paris.
• [11] Foguel, S. R. (1969). The Ergodic Theory of Markov Processes. Van Nostrand Reinhold, New York.
• [12] Glynn, P. W. (1994). Poisson's equation for the recurrent M/G/1 queue. Adv. in Appl. Probab. 26 1044-1062.
• [13] Hall, P. and Heyde, C. C. (1980). Martingale Limit Theory and Its Application. Academic, New York.
• [14] Kumar, P. R. and Meyn, S. P. (1995). Stability of queueing networks and scheduling policies. IEEE Trans. Automat. Control 40 251-260.
• [15] Kurtz, T. G. (1981). The central limit theorem for Markov chains. Ann. Probab. 9 557-560.
• [16] Kushner, H. J. (1967). Stochastic Stability and Control. Academic, New York.
• [17] Maigret, N. (1978). Théorème de limite centrale pour une chaîne de Markov récurrente Harris positive. Ann. Inst. H. Poincaré Probab. Statist. 14 425-440.
• [18] Makowski, A. and Shwartz, A. (1992). Stochastic approximations and adaptive control of a discrete-time single-server network with random routing. SIAM J. Control Optim. 30.
• [19] Metivier, M. and Priouret, P. (1987). Théorèmes de convergence presque sûre pour une classe d'algorithmes stochastiques à pas décroissants. Probab. Theory Related Fields 74 403-428.
• [20] Meyn, S. P. and Down, D. (1994). Stability of generalized Jackson networks. Ann. Appl. Probab. 4 124-148.
• [21] Meyn, S. P. and Guo, L. (1992). Stability, convergence, and performance of an adaptive control algorithm applied to a randomly varying system. IEEE Trans. Automat. Control 37 535-540.
• [22] Meyn, S. P. and Guo, L. (1993). Geometric ergodicity of a bilinear time series model. J. Time Ser. Anal. 14 93-108.
• [23] Meyn, S. P. and Tweedie, R. L. (1992). Stability of Markovian processes I. Discrete time chains. Adv. in Appl. Probab. 24 542-574.
• [24] Meyn, S. P. and Tweedie, R. L. (1993). Generalized resolvents and Harris recurrence of Markov processes. Contemp. Math. 149 227-250.
• [25] Meyn, S. P. and Tweedie, R. L. (1993). Markov Chains and Stochastic Stability. Springer, London.
• [26] Meyn, S. P. and Tweedie, R. L. (1993). Stability of Markovian processes II. Continuous time processes and sampled chains. Adv. in Appl. Probab. 25 487-517.
• [27] Meyn, S. P. and Tweedie, R. L. (1993). Stability of Markovian processes III. Foster-Lyapunov criteria for continuous time processes. Adv. in Appl. Probab. 25 518-548.
• [28] Neveu, J. (1972). Potentiel Markovien récurrent des chaînes de Harris. Ann. Inst. Fourier (Grenoble) 22 7-130.
• [29] Nummelin, E. (1984). General Irreducible Markov Chains and Non-Negative Operators. Cambridge Univ. Press.
• [30] Nummelin, E. (1991). On the Poisson equation in the potential theory of a single kernel. Math. Scand. 68 59-82.
• [31] Nummelin, E. and Tuominen, P. (1982). Geometric ergodicity of Harris recurrent Markov chains with applications to renewal theory. Stochastic Process. Appl. 12 187-202.
• [32] Revuz, D. (1984). Markov Chains, 2nd ed. North-Holland, Amsterdam.
• [33] Ross, S. M. (1984). Introduction to Stochastic Dynamic Programming. Academic, New York.
• [34] Schweitzer, P. J. (1968). Perturbation theory and finite Markov chains. J. Appl. Probab. 5 401-403.
• [35] Sharpe, M. (1988). General Theory of Markov Processes. Academic, New York.
• [36] Shwartz, A. and Makowski, A. (1991). On the Poisson equation for Markov chains: existence of solutions and parameter dependence. Technical report, Dept. Electrical Engineering, Technion-Israel Institute of Technology.