Open Access
A Bayesian Bootstrap for Mixture Models (2024)
Fuheng Cui, Stephen G. Walker
Bayesian Anal. Advance Publication 1-28 (2024). DOI: 10.1214/24-BA1498

Abstract

This paper proposes a new nonparametric Bayesian bootstrap for mixture models, obtained by suitably developing the traditional Bayesian bootstrap. We first reinterpret the Bayesian bootstrap, which uses the Pólya-urn scheme, as a stochastic gradient algorithm. The key is then to use the same basic mechanism as the Bayesian bootstrap, switching from a point mass kernel to a continuous kernel. Just as the Bayesian bootstrap works solely from the empirical distribution function, the new Bayesian bootstrap for mixture models works from the nonparametric maximum likelihood estimator of the mixing distribution. From a theoretical perspective, we prove the convergence and asymptotic exchangeability of the sample sequences generated by the algorithm, and we illustrate the method with various models and settings, including real data.
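For context, the traditional Bayesian bootstrap that the paper builds on can be described by the standard Dirichlet-weights formulation (equivalent to the Pólya-urn construction): each replicate places Dirichlet(1, ..., 1) weights on the observed data points, i.e. a random distribution supported on the data. The sketch below is a minimal illustration of that classical scheme only, not of the paper's new mixture-model algorithm (which replaces the point mass kernel with a continuous kernel and works from the NPMLE of the mixing distribution); the function name and the choice of the mean as the functional of interest are illustrative assumptions.

```python
import numpy as np

def bayesian_bootstrap_mean(x, n_draws=1000, seed=None):
    """Classical Bayesian bootstrap draws for the mean of a sample.

    Each replicate assigns Dirichlet(1, ..., 1) weights to the observed
    points, giving a random distribution supported on the data; this is
    the point-mass-kernel setting the paper generalises.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x, dtype=float)
    n = len(x)
    # One row of Dirichlet(1, ..., 1) weights per bootstrap replicate
    weights = rng.dirichlet(np.ones(n), size=n_draws)  # shape (n_draws, n)
    # Evaluate the functional of interest (here, the mean) under each
    # random distribution
    return weights @ x

# Usage: posterior-style draws for the mean of a small sample
data = [1.2, 0.7, 2.3, 1.9, 0.4]
draws = bayesian_bootstrap_mean(data, n_draws=5000, seed=0)
print(draws.mean(), draws.std())
```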

Acknowledgments

The authors would like to thank the editor and reviewers for comments and suggestions which have enabled us to improve the paper substantially.

Citation


Fuheng Cui, Stephen G. Walker. "A Bayesian Bootstrap for Mixture Models." Bayesian Anal., Advance Publication, 1-28, 2024. https://doi.org/10.1214/24-BA1498

Information

Published: 2024
First available in Project Euclid: 12 December 2024

arXiv: 2310.00880
Digital Object Identifier: 10.1214/24-BA1498

Subjects:
Primary: 62C10, 62G09
Secondary: 62G20

Keywords: asymptotic exchangeability, Bayesian nonparametrics, score function, stochastic gradient algorithm

Rights: © 2024 International Society for Bayesian Analysis
