The Annals of Applied Probability

Approximation algorithms for the normalizing constant of Gibbs distributions

Mark Huber

Abstract

Consider a family of distributions $\{\pi_{\beta}\}$ where $X\sim\pi_{\beta}$ means that $\mathbb{P}(X=x)=\exp(-\beta H(x))/Z(\beta)$. Here $Z(\beta)$ is the proper normalizing constant, equal to $\sum_{x}\exp(-\beta H(x))$. Then $\{\pi_{\beta}\}$ is known as a Gibbs distribution, and $Z(\beta)$ is the partition function. This work presents a new method for approximating the partition function to a specified level of relative accuracy using only a number of samples that is $O(\ln(Z(\beta))\ln(\ln(Z(\beta))))$ when $Z(0)\geq1$. This is a sharp improvement over previous, similar approaches, which used a much more complicated algorithm requiring $O(\ln(Z(\beta))(\ln(\ln(Z(\beta))))^{5})$ samples.
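
The following display is an illustrative sketch rather than the algorithm analyzed in the paper: it records the standard telescoping-product (multistage sampling) identity that explains how samples from the family $\{\pi_{\beta}\}$ can be turned into an estimate of $Z(\beta)$. The cooling schedule $0=\beta_{0}<\beta_{1}<\cdots<\beta_{k}=\beta$ is an assumed ingredient introduced here only for illustration.

$$\frac{Z(\beta)}{Z(0)}=\prod_{i=0}^{k-1}\frac{Z(\beta_{i+1})}{Z(\beta_{i})},\qquad \frac{Z(\beta_{i+1})}{Z(\beta_{i})}=\mathbb{E}_{X\sim\pi_{\beta_{i}}}\left[\exp\bigl(-(\beta_{i+1}-\beta_{i})H(X)\bigr)\right].$$

Each ratio is an expectation under $\pi_{\beta_{i}}$ and can therefore be estimated by a sample mean of draws from that distribution, while $Z(0)$ is typically known in closed form (for a finite state space, $Z(0)$ is its cardinality). The sample complexity quoted in the abstract is governed by how the cooling schedule and the per-ratio estimates are chosen.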

Article information

Source
Ann. Appl. Probab., Volume 25, Number 2 (2015), 974–985.

Dates
First available in Project Euclid: 19 February 2015

Permanent link to this document
https://projecteuclid.org/euclid.aoap/1424355135

Digital Object Identifier
doi:10.1214/14-AAP1015

Mathematical Reviews number (MathSciNet)
MR3313760

Zentralblatt MATH identifier
1328.65011

Subjects
Primary: 68Q87: Probability in computer science (algorithm analysis, random structures, phase transitions, etc.) [See also 68W20, 68W40]; 65C60: Computational problems in statistics
Secondary: 65C05: Monte Carlo methods

Keywords
Integration; Monte Carlo methods; cooling schedule; self-reducible

Citation

Huber, Mark. Approximation algorithms for the normalizing constant of Gibbs distributions. Ann. Appl. Probab. 25 (2015), no. 2, 974–985. doi:10.1214/14-AAP1015. https://projecteuclid.org/euclid.aoap/1424355135

