Abstract
In this paper we consider the posterior consistency of Bayesian inference procedures when the family of models consists of appropriate stochastic processes. Specifically, we suppose that one observes an unknown ergodic process and has access to a family of models consisting of dependent processes. In this context, we consider Gibbs posterior inference, a loss-based generalization of standard Bayesian inference. Our main results characterize the asymptotic behavior of the Gibbs posterior distributions on the space of models. Furthermore, we show that in the case of properly specified models our convergence results may be used to establish posterior consistency. Our model processes are defined via the thermodynamic formalism for dynamical systems, and they allow for a large degree of dependence, including both Markov chains of unbounded order and processes that are not Markov of any order. This work establishes close connections between Gibbs posterior inference and the thermodynamic formalism for dynamical systems, which we hope will lead to new questions and results in both nonparametric Bayesian analysis and the thermodynamic formalism.
Funding Statement
KM acknowledges the support of National Science Foundation grants DMS-1613261 and DMS-1847144. SM acknowledges support from National Science Foundation grants DEB-1840223, DMS-1713012, and DMS-1613261, as well as National Institutes of Health grant R01 DK116187-01 and Human Frontier Science Program grant RGP0051/2017. ABN acknowledges support from National Science Foundation grants DMS-1613261 and DMS-1613072, as well as National Institutes of Health grant R01 HG009125-01.
Citation
Kevin McGoff, Sayan Mukherjee, Andrew B. Nobel. "Gibbs posterior convergence and the thermodynamic formalism." Ann. Appl. Probab. 32 (1), 461–496, February 2022. https://doi.org/10.1214/21-AAP1685