The Annals of Statistics

Evaluation of formal posterior distributions via Markov chain arguments

Morris L. Eaton, James P. Hobert, Galin L. Jones, and Wen-Lin Lai



We consider the evaluation of proper posterior distributions obtained from improper prior distributions. Our context is estimating a bounded function φ of a parameter when the loss is quadratic. If the posterior mean of φ is admissible for all bounded φ, the posterior is strongly admissible. We give sufficient conditions for strong admissibility. These conditions involve the recurrence of a Markov chain associated with the estimation problem. We also develop sufficient conditions for the recurrence of general state space Markov chains that are of independent interest. Our main example concerns the p-dimensional multivariate normal distribution with mean vector θ when the prior distribution has the form g(‖θ‖²) on the parameter space ℝp. Conditions on g for strong admissibility of the posterior are provided.
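The dichotomy that drives the abstract's argument — a recurrent chain returns to every compact set, a transient one eventually escapes — can be illustrated with a small simulation. This is a hypothetical sketch of the recurrence concept using simple random walks on ℝ, not the Markov chain the authors construct from the estimation problem; the function `return_frequency` and its parameters are illustrative choices.

```python
import random

def return_frequency(step, n_chains=200, n_steps=2000, radius=1.0):
    """Fraction of chains, started at 0, that leave the compact set
    [-radius, radius] and later re-enter it within n_steps steps.
    `step` draws one increment of the chain."""
    returns = 0
    for _ in range(n_chains):
        x, left = 0.0, False
        for _ in range(n_steps):
            x += step()
            if abs(x) > radius:
                left = True      # chain has exited the compact set
            elif left:
                returns += 1     # chain re-entered: a "return" event
                break
    return returns / n_chains

random.seed(0)
# Mean-zero Gaussian increments: a recurrent random walk on R,
# so chains that leave the set should almost always come back.
recurrent_freq = return_frequency(lambda: random.gauss(0.0, 1.0))
# Increments with drift +1: a transient chain that rarely returns.
transient_freq = return_frequency(lambda: random.gauss(1.0, 1.0))
print(recurrent_freq, transient_freq)
```

The empirical return frequency is near 1 for the driftless walk and much smaller for the drifting one, mirroring the recurrence/transience distinction on which the paper's admissibility conditions rest.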

Article information

Ann. Statist., Volume 36, Number 5 (2008), 2423-2452.

First available in Project Euclid: 13 October 2008


Primary: 62C15: Admissibility
Secondary: 60J05: Discrete-time Markov processes on general state spaces

Keywords: Admissibility; formal Bayes; improper prior distribution; multivariate normal distribution; recurrence; superharmonic function


Eaton, Morris L.; Hobert, James P.; Jones, Galin L.; Lai, Wen-Lin. Evaluation of formal posterior distributions via Markov chain arguments. Ann. Statist. 36 (2008), no. 5, 2423--2452. doi:10.1214/07-AOS542.



  • [1] Abramowitz, M. and Stegun, I. A. (1964). Handbook of Mathematical Functions. Dover, New York.
  • [2] Berger, J. O., Strawderman, W. and Tang, D. (2005). Posterior propriety and admissibility of hyperpriors in normal hierarchical models. Ann. Statist. 33 606–646.
  • [3] Billingsley, P. (1995). Probability and Measure, 3rd ed. Wiley, New York.
  • [4] Brown, L. D. (1971). Admissible estimators, recurrent diffusions, and insoluble boundary value problems. Ann. Math. Statist. 42 855–903.
  • [5] Chung, K. L. and Fuchs, W. H. (1951). On the distribution of values of sums of random variables. Mem. Amer. Math. Soc. 6 1–12.
  • [6] Diaconis, P. and Stroock, D. (1991). Geometric bounds for eigenvalues of Markov chains. Ann. Appl. Probab. 1 36–61.
  • [7] Eaton, M. L. (1982). A method for evaluating improper prior distributions. In Statistical Decision Theory and Related Topics III (S. S. Gupta and J. O. Berger, eds.) 1 329–352. Academic Press, New York.
  • [8] Eaton, M. L. (1992). A statistical diptych: Admissible inferences—recurrence of symmetric Markov chains. Ann. Statist. 20 1147–1179.
  • [9] Eaton, M. L. (2001). Markov chain conditions for admissibility in estimation problems with quadratic loss. In State of the Art in Probability and Statistics—A Festschrift for Willem R. van Zwet (M. de Gunst, C. Klaassen and A. van der Vaart, eds.) 223–243. IMS Lecture Notes Ser. 36. IMS, Beachwood, OH.
  • [10] Eaton, M. L. (2004). Evaluating improper priors and the recurrence of symmetric Markov chains: An overview. In A Festschrift to Honor Herman Rubin (A. Dasgupta, ed.). IMS Lecture Notes Ser. 45. IMS, Beachwood, OH.
  • [11] Eaton, M. L., Hobert, J. P. and Jones, G. L. (2007). On perturbations of strongly admissible prior distributions. Ann. Inst. H. Poincaré Probab. Statist. 43 633–653.
  • [12] Feller, W. (1966). An Introduction to Probability Theory and Its Applications. II. Wiley, New York.
  • [13] Fukushima, M., Oshima, Y. and Takeda, M. (1994). Dirichlet Forms and Symmetric Markov Processes. de Gruyter, Berlin.
  • [14] Hobert, J. P. and Robert, C. P. (1999). Eaton’s Markov chain, its conjugate partner and P-admissibility. Ann. Statist. 27 361–373.
  • [15] Kass, R. E. and Wasserman, L. (1996). The selection of prior distributions by formal rules. J. Amer. Statist. Assoc. 91 1343–1370.
  • [16] Lai, W.-L. (1996). Admissibility and the recurrence of Markov chains with applications. Ph.D. thesis, Univ. Minnesota.
  • [17] Lamperti, J. (1960). Criteria for the recurrence or transience of stochastic processes. I. J. Math. Anal. Appl. 1 314–330.
  • [18] Lehmann, E. L. (1986). Testing Statistical Hypotheses. Wiley, New York.
  • [19] Meyn, S. P. and Tweedie, R. L. (1993). Markov Chains and Stochastic Stability. Springer, London.
  • [20] Revuz, D. (1984). Markov Chains. North-Holland, Amsterdam.