Brazilian Journal of Probability and Statistics

Comparing consensus Monte Carlo strategies for distributed Bayesian computation

Steven L. Scott



Consensus Monte Carlo is an algorithm for conducting Monte Carlo based Bayesian inference on large data sets distributed across many worker machines in a data center. The algorithm operates by running a separate Monte Carlo algorithm on each worker machine, each of which sees only a portion of the full data set. The worker-level posterior samples are then combined to form a Monte Carlo approximation to the full posterior distribution based on the complete data set. We compare several methods of carrying out the combination, including a new method based on approximating the worker-level simulations with a mixture of multivariate Gaussian distributions. We find that resampling and kernel density based methods break down in 10 or sometimes fewer dimensions, while the new mixture-based approach works well, but the necessary mixture models take too long to fit.
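The simplest combination rule compared here is the weighted averaging of worker-level draws from Scott et al. (2016), which is exact when every worker-level posterior is Gaussian. A minimal sketch of that averaging step (function and variable names are illustrative, not from the paper's code):

```python
import numpy as np

def consensus_average(worker_draws):
    """Combine worker-level posterior draws by weighted averaging.

    worker_draws: list of (S, d) arrays, one per worker, each holding S
    posterior draws of a d-dimensional parameter based on that worker's
    shard of the data.  Returns an (S, d) array of consensus draws.
    """
    # Weight each worker by the inverse of its sample covariance; under
    # Gaussian worker-level posteriors this reproduces the full posterior.
    weights = [np.linalg.inv(np.cov(draws, rowvar=False))
               for draws in worker_draws]
    total = np.linalg.inv(sum(weights))          # (W1 + ... + Wk)^{-1}
    combined = sum(w @ draws.T                   # sum_k W_k theta_k^{(s)}
                   for w, draws in zip(weights, worker_draws))
    return (total @ combined).T
```

Because the rule is linear in the draws, the mean of the consensus sample is the precision-weighted average of the worker means; the resampling, kernel density, and Gaussian-mixture strategies compared in the paper replace this linear rule with nonparametric or mixture approximations to each worker-level posterior.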

Article information

Braz. J. Probab. Stat., Volume 31, Number 4 (2017), 668-685.

Received: December 2016
Accepted: April 2017
First available in Project Euclid: 15 December 2017


Keywords: big data; cloud computing; Bayesian modeling


Scott, Steven L. Comparing consensus Monte Carlo strategies for distributed Bayesian computation. Braz. J. Probab. Stat. 31 (2017), no. 4, 668--685. doi:10.1214/17-BJPS365.



