Electronic Journal of Statistics

The explicit form of expectation propagation for a simple statistical model

Andy S. I. Kim and M. P. Wand


We derive the explicit form of expectation propagation for approximate deterministic Bayesian inference in a simple statistical model. The model corresponds to a random sample from the Normal distribution. The explicit forms, and their derivation, allow a deeper understanding of the issues and challenges involved in practical implementation of expectation propagation for statistical analyses. No auxiliary approximations are used: we follow the expectation propagation prescription exactly. A simulation study shows expectation propagation to be more accurate than mean field variational Bayes for larger sample sizes, but at the cost of considerably more algebraic and computational effort.
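The paper treats a random sample from the Normal distribution with both mean and variance unknown, where expectation propagation's moment-matching steps require genuine algebraic effort. As a hedged toy sketch (not the paper's algorithm), the code below runs EP site updates for the even simpler known-variance case, where every site is exactly Gaussian, moment matching is trivial, and EP recovers the exact conjugate posterior; all variable names and hyperparameter values are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
y = rng.normal(loc=2.0, scale=1.0, size=50)  # simulated data (assumption)
sigma2 = 1.0            # known likelihood variance (assumption)
mu0, v0 = 0.0, 100.0    # Gaussian prior on the mean (assumption)

n = len(y)
# natural parameters (precision r, precision-times-mean m) of each site
r = np.zeros(n)
m = np.zeros(n)
# global Gaussian approximation starts at the prior
R = 1.0 / v0
M = mu0 / v0

for _ in range(5):          # EP sweeps over the sites
    for i in range(n):
        # cavity distribution: remove site i from the global approximation
        R_cav = R - r[i]
        M_cav = M - m[i]
        # tilted distribution = cavity * exact likelihood term for y[i];
        # here the likelihood term is Gaussian, so moment matching is exact
        r_new = 1.0 / sigma2
        m_new = y[i] / sigma2
        # restore the updated site into the global approximation
        R = R_cav + r_new
        M = M_cav + m_new
        r[i], m[i] = r_new, m_new

post_var = 1.0 / R
post_mean = M / R

# exact conjugate posterior for comparison
exact_var = 1.0 / (1.0 / v0 + n / sigma2)
exact_mean = exact_var * (mu0 / v0 + y.sum() / sigma2)
```

In this Gaussian-sites setting EP converges after one sweep and `(post_mean, post_var)` coincide with `(exact_mean, exact_var)`; the interesting behaviour studied in the paper arises precisely when the sites are not Gaussian and the tilted-distribution moments must be computed (e.g. by quadrature).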

Article information

Electron. J. Statist., Volume 10, Number 1 (2016), 550-581.

Received: December 2014
First available in Project Euclid: 4 March 2016


Primary: 62F15: Bayesian inference
Secondary: 62H12: Estimation

Keywords: Bayesian computing; factor graph; hierarchical Bayesian models; message passing algorithm; quadrature; variational message passing


Kim, Andy S. I.; Wand, M. P. The explicit form of expectation propagation for a simple statistical model. Electron. J. Statist. 10 (2016), no. 1, 550--581. doi:10.1214/16-EJS1114. https://projecteuclid.org/euclid.ejs/1457123506

