Abstract
This paper proposes a new nonparametric Bayesian bootstrap for a mixture model, developed by suitably extending the traditional Bayesian bootstrap. We first reinterpret the Bayesian bootstrap, which uses the Pólya-urn scheme, as a stochastic gradient algorithm. The key is then to retain the same basic mechanism as the Bayesian bootstrap while switching from a point-mass kernel to a continuous kernel. Just as the Bayesian bootstrap works solely from the empirical distribution function, the new Bayesian bootstrap for mixture models works from the nonparametric maximum likelihood estimator of the mixing distribution. From a theoretical perspective, we prove the convergence and asymptotic exchangeability of the sample sequences generated by the algorithm; we also illustrate the method with various models and settings, including real data.
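For readers unfamiliar with the traditional Bayesian bootstrap that the paper builds on, the following is a minimal sketch of the classical version (Rubin, 1981), in which posterior draws are obtained by placing flat Dirichlet weights on the observed data points; this is equivalent to the Pólya-urn scheme in the limit. The function name and the choice of the mean functional are illustrative only and are not the paper's new algorithm for mixture models.

```python
import numpy as np

def bayesian_bootstrap_means(data, n_replicates=1000, seed=0):
    """Classical Bayesian bootstrap: each replicate re-weights the
    empirical distribution with Dirichlet(1, ..., 1) weights and
    returns the corresponding weighted mean."""
    rng = np.random.default_rng(seed)
    data = np.asarray(data, dtype=float)
    n = len(data)
    # Flat Dirichlet weights over the n observations, one row per replicate.
    weights = rng.dirichlet(np.ones(n), size=n_replicates)  # shape (n_replicates, n)
    # Weighted means: posterior samples of the mean functional.
    return weights @ data

# Example usage on synthetic data (illustrative only).
if __name__ == "__main__":
    x = np.random.default_rng(1).normal(loc=2.0, scale=1.0, size=50)
    draws = bayesian_bootstrap_means(x)
    print(draws.mean(), draws.std())
```

The paper's contribution replaces the point-mass kernel implicit in this scheme with a continuous kernel and works from the nonparametric maximum likelihood estimator of the mixing distribution rather than the empirical distribution function.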
Acknowledgments
The authors would like to thank the editor and reviewers for comments and suggestions which have enabled us to improve the paper substantially.
Citation
Fuheng Cui, Stephen G. Walker. "A Bayesian Bootstrap for Mixture Models." Bayesian Anal., Advance Publication, 1–28, 2024. https://doi.org/10.1214/24-BA1498