Bernoulli, Volume 24, Number 4B (2018), 3181–3221.

Posteriors, conjugacy, and exponential families for completely random measures

Tamara Broderick, Ashia C. Wilson, and Michael I. Jordan


We demonstrate how to calculate posteriors for general Bayesian nonparametric priors and likelihoods based on completely random measures (CRMs). We further show how to represent Bayesian nonparametric priors as a sequence of finite draws using a size-biasing approach, and how to represent full Bayesian nonparametric models via finite marginals. Motivated by conjugate priors based on exponential family representations of likelihoods, we introduce a notion of exponential families for CRMs, which we call exponential CRMs. This construction allows us to specify automatic Bayesian nonparametric conjugate priors for exponential CRM likelihoods. We show that our exponential CRMs admit particularly straightforward recipes for size-biased and marginal representations of Bayesian nonparametric models. Along the way, we prove that the gamma process is a conjugate prior for the Poisson likelihood process and that the beta prime process is a conjugate prior for a process we call the odds Bernoulli process. We deliver a size-biased representation of the gamma process and a marginal representation of the gamma process coupled with a Poisson likelihood process.
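The gamma–Poisson conjugacy the abstract refers to can be illustrated in a finite-dimensional truncation: if each atom of a discrete measure carries a rate θ_k ~ Gamma(a, b) and we observe counts x_k ~ Poisson(θ_k), the per-atom posterior is Gamma(a + x_k, b + 1). The sketch below is ours, not the paper's CRM-level construction (which handles countably infinite atoms); the function name is chosen for illustration.

```python
def gamma_poisson_posterior(a, b, counts):
    """Per-atom posterior (shape, rate) pairs for a Gamma(a, b) prior
    (shape-rate parameterization) under independent Poisson likelihoods.

    Each observed count x_k shifts the shape to a + x_k; each
    observation window adds 1 to the rate, giving b + 1.
    """
    return [(a + x, b + 1.0) for x in counts]

# Three atoms with prior Gamma(2, 1) and observed counts 0, 3, 7:
posterior = gamma_poisson_posterior(2.0, 1.0, [0, 3, 7])
# -> [(2.0, 2.0), (5.0, 2.0), (9.0, 2.0)]
```

The paper's contribution is showing that this kind of update survives the passage to completely random measures, where the prior is a gamma process rather than finitely many independent gamma variables.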

Article information


Received: October 2014
Revised: May 2016
First available in Project Euclid: 18 April 2018

Keywords: Bayesian nonparametrics; beta process; completely random measure; conjugacy; exponential family; Indian buffet process; posterior; size-biased


Broderick, Tamara; Wilson, Ashia C.; Jordan, Michael I. Posteriors, conjugacy, and exponential families for completely random measures. Bernoulli 24 (2018), no. 4B, 3181--3221. doi:10.3150/16-BEJ855.
