2009 A Bernstein-Von Mises Theorem for discrete probability distributions
S. Boucheron, E. Gassiat
Electron. J. Statist. 3: 114-148 (2009). DOI: 10.1214/08-EJS262

Abstract

We investigate the asymptotic normality of the posterior distribution in the discrete setting, when the model dimension increases with the sample size. We consider a probability mass function $\theta_0$ on $\mathbb{N}\setminus\{0\}$ and a sequence of truncation levels $(k_n)_n$ satisfying $k_n^3\leq n\inf_{i\leq k_n}\theta_0(i)$. Let $\hat{\theta}_n$ denote the maximum likelihood estimate of $(\theta_0(i))_{i\leq k_n}$ and let $\Delta_n(\theta_0)$ denote the $k_n$-dimensional vector whose $i$-th coordinate is defined by $\sqrt{n}(\hat{\theta}_{n}(i)-\theta_{0}(i))$ for $1\leq i\leq k_n$. We check that, under mild conditions on $\theta_0$ and on the sequence of prior probabilities on the $k_n$-dimensional simplices, the variation distance between the posterior distribution recentered around $\hat{\theta}_n$ and rescaled by $\sqrt{n}$, and the $k_n$-dimensional Gaussian distribution $\mathcal{N}(\Delta_{n}(\theta_{0}),I^{-1}(\theta_{0}))$, converges in probability to 0. This theorem can be used to prove the asymptotic normality of Bayesian estimators of the Shannon and Rényi entropies.
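The phenomenon described in the abstract can be illustrated numerically. The sketch below (hypothetical, not from the paper; all names and the choice of a flat Dirichlet prior are assumptions) uses a fixed-dimension multinomial, where the Bernstein–von Mises result is classical: the posterior, recentered at the maximum likelihood estimate and rescaled by $\sqrt{n}$, has marginal variances close to those of the Gaussian limit with covariance $I^{-1}(\theta_0)=\mathrm{diag}(\theta_0)-\theta_0\theta_0^\top$.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: a 5-category multinomial (fixed dimension,
# unlike the growing-dimension regime of the paper).
theta0 = np.array([0.35, 0.25, 0.20, 0.12, 0.08])
n = 20000                              # sample size

counts = rng.multinomial(n, theta0)
theta_hat = counts / n                 # maximum likelihood estimate

# Under a flat Dirichlet(1,...,1) prior, the posterior is
# Dirichlet(counts + 1); draw samples from it.
post = rng.dirichlet(counts + 1, size=100_000)

# Recenter around the MLE and rescale by sqrt(n).
z = np.sqrt(n) * (post - theta_hat)

# The BvM limit has covariance I^{-1}(theta0) = diag(theta0) - theta0 theta0^T,
# so each marginal variance is theta0(i) * (1 - theta0(i)).
emp_var = z.var(axis=0)
bvm_var = theta0 * (1 - theta0)
print(np.round(emp_var, 4))
print(np.round(bvm_var, 4))
```

The empirical marginal variances of the rescaled posterior should agree with $\theta_0(i)(1-\theta_0(i))$ up to Monte Carlo and sampling error, which shrinks as $n$ grows.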

The proofs are based on concentration inequalities for centered and non-centered chi-square (Pearson) statistics. The latter allow us to establish posterior concentration rates with respect to the Fisher distance rather than the Hellinger distance, as is commonplace in non-parametric Bayesian statistics.

Citation

Download Citation

S. Boucheron, E. Gassiat. "A Bernstein-Von Mises Theorem for discrete probability distributions." Electron. J. Statist. 3: 114-148, 2009. https://doi.org/10.1214/08-EJS262

Information

Published: 2009
First available in Project Euclid: 28 January 2009

zbMATH: 1326.62036
MathSciNet: MR2471588
Digital Object Identifier: 10.1214/08-EJS262

Subjects:
Primary: 60K35
Secondary: 60K35

Rights: Copyright © 2009 The Institute of Mathematical Statistics and the Bernoulli Society

JOURNAL ARTICLE
35 PAGES

