Open Access
Bayesian consistency for a nonparametric stationary Markov model
Minwoo Chae, Stephen G. Walker
Bernoulli 25(2): 877-901 (May 2019). DOI: 10.3150/17-BEJ1007


We consider posterior consistency for a Markov model with a novel class of nonparametric priors. In this model, the transition density is parameterized via a mixing distribution function, so the Wasserstein distance between mixing measures can be used to construct neighborhoods of a transition density. The Wasserstein distance is sufficiently strong: for example, if the mixing distributions are compactly supported, it dominates the sup-$L_{1}$ metric. We provide sufficient conditions for posterior consistency with respect to the Wasserstein metric, provided that the true transition density is also parameterized via a mixing distribution. In general, when it cannot be parameterized by a mixing distribution, we show that the posterior distribution is consistent with respect to the average $L_{1}$ metric. We also provide a prior whose support is large enough to contain most smooth transition densities.
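As an illustrative aside (not part of the paper): for discrete mixing measures on the real line, the 1-Wasserstein distance used above has a closed form as the integral of the absolute difference of the two CDFs. A minimal sketch, assuming distinct support points and normalized weights:

```python
# Sketch: W_1 distance between two discrete probability measures on R,
# computed as the integral of |F_P(x) - F_Q(x)| dx.
# Function name, inputs, and examples are illustrative, not from the paper.

def wasserstein_1(support_p, weights_p, support_q, weights_q):
    """1-Wasserstein distance between discrete measures sum_i w_i delta_{x_i}.

    Assumes support points within each measure are distinct and
    weights sum to one.
    """
    points = sorted(set(support_p) | set(support_q))
    wp = dict(zip(support_p, weights_p))
    wq = dict(zip(support_q, weights_q))
    cdf_p = cdf_q = 0.0
    dist = 0.0
    for left, right in zip(points[:-1], points[1:]):
        # CDFs are piecewise constant; accumulate mass at the left endpoint,
        # then integrate the constant gap over the interval (left, right).
        cdf_p += wp.get(left, 0.0)
        cdf_q += wq.get(left, 0.0)
        dist += abs(cdf_p - cdf_q) * (right - left)
    return dist

# Point mass at 0 vs point mass at 1: optimal transport moves
# all mass a distance of 1.
print(wasserstein_1([0.0], [1.0], [1.0], [1.0]))  # -> 1.0
```

The same quantity bounds how far apart the induced transition densities can be, which is what makes Wasserstein neighborhoods of the mixing measure meaningful neighborhoods of the transition density.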




Received: 1 April 2016; Revised: 1 September 2017; Published: May 2019
First available in Project Euclid: 6 March 2019

zbMATH: 07049394
MathSciNet: MR3920360
Digital Object Identifier: 10.3150/17-BEJ1007

Keywords: Kullback–Leibler support, mixtures, nonparametric Markov model, posterior consistency, Wasserstein metric

Rights: Copyright © 2019 Bernoulli Society for Mathematical Statistics and Probability

