Brazilian Journal of Probability and Statistics

Comment: Consensus Monte Carlo using expectation propagation

Andrew Gelman and Aki Vehtari

Article information

Source
Braz. J. Probab. Stat., Volume 31, Number 4 (2017), 692-696.

Dates
Received: June 2017
Accepted: June 2017
First available in Project Euclid: 15 December 2017

Permanent link to this document
https://projecteuclid.org/euclid.bjps/1513328762

Digital Object Identifier
doi:10.1214/17-BJPS365A

Mathematical Reviews number (MathSciNet)
MR3738173

Zentralblatt MATH identifier
1385.65005

Citation

Gelman, Andrew; Vehtari, Aki. Comment: Consensus Monte Carlo using expectation propagation. Braz. J. Probab. Stat. 31 (2017), no. 4, 692--696. doi:10.1214/17-BJPS365A. https://projecteuclid.org/euclid.bjps/1513328762

References

  • Ahn, S., Korattikara, A. and Welling, M. (2012). Bayesian posterior sampling via stochastic gradient Fisher scoring. In Proceedings of the 29th International Conference on Machine Learning.
  • Gelman, A. (2016). Explanations for that shocking 2% shift. Statistical modeling, causal inference, and social science blog, 9 Nov. Available at http://andrewgelman.com/2016/11/09/explanations-shocking-2-shift/.
  • Gelman, A., Vehtari, A., Jylänki, P., Robert, C., Chopin, N. and Cunningham, J. P. (2014). Expectation propagation as a way of life. Available at arXiv:1412.4869.
  • Gershman, S., Hoffman, M. and Blei, D. (2012). Nonparametric variational inference. In Proceedings of the 29th International Conference on Machine Learning.
  • Heskes, T., Opper, M., Wiegerinck, W., Winther, O. and Zoeter, O. (2005). Approximate inference techniques with expectation constraints. Journal of Statistical Mechanics: Theory and Experiment P11015.
  • Hoffman, M., Blei, D. M., Wang, C. and Paisley, J. (2013). Stochastic variational inference. Journal of Machine Learning Research 14, 1303–1347.
  • Huang, Z. and Gelman, A. (2005). Sampling for Bayesian computation with large datasets. Technical report, Department of Statistics, Columbia University.
  • Minka, T. (2001). Expectation propagation for approximate Bayesian inference. In Proceedings of the Seventeenth Conference on Uncertainty in Artificial Intelligence (J. Breese and D. Koller, eds.) 362–369.
  • Neiswanger, W., Wang, C. and Xing, E. (2013). Asymptotically exact, embarrassingly parallel MCMC. Available at arXiv:1311.4780.
  • Scott, S. L. (2017). Comparing consensus Monte Carlo strategies for distributed Bayesian computation. Brazilian Journal of Probability and Statistics. To appear.
  • Scott, S. L., Blocker, A. W., Bonassi, F. V., Chipman, H. A., George, E. I. and McCulloch, R. E. (2013). Bayes and big data: The consensus Monte Carlo algorithm. In Bayes 250. Available at http://research.google.com/pubs/pub41849.html.
  • Tresp, V. (2000). A Bayesian committee machine. Neural Computation 12, 2719–2741.
  • Wang, C. and Blei, D. M. (2013). Variational inference in nonconjugate models. Journal of Machine Learning Research 14, 899–925.
  • Wang, C., Chen, M. H., Schifano, E., Wu, J. and Yan, J. (2015a). Statistical methods and computing for big data. Available at arXiv:1502.07989.
  • Wang, W., Rothschild, D., Goel, S. and Gelman, A. (2015b). Forecasting elections with non-representative polls. International Journal of Forecasting 31, 980–991.
  • Wang, X. and Dunson, D. B. (2013). Parallelizing MCMC via Weierstrass sampler. Available at arXiv:1312.4605.

See also

  • Main article: Comparing consensus Monte Carlo strategies for distributed Bayesian computation.