Abstract
Consider a family of distributions $\{\pi_{\beta}\}$ where $X\sim\pi_{\beta}$ means that $\mathbb{P}(X=x)=\exp(-\beta H(x))/Z(\beta)$. Here $Z(\beta)$ is the normalizing constant, equal to $\sum_{x}\exp(-\beta H(x))$. Then $\pi_{\beta}$ is known as a Gibbs distribution, and $Z(\beta)$ is the partition function. This work presents a new method for approximating the partition function to a specified level of relative accuracy using a number of samples that is $O(\ln(Z(\beta))\ln\ln(Z(\beta)))$ when $Z(0)\geq1$. This is a sharp improvement over previous, similar approaches, which used a much more complicated algorithm requiring $O(\ln(Z(\beta))[\ln\ln(Z(\beta))]^{5})$ samples.
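As a concrete illustration of the definitions above (not the paper's approximation algorithm), the following sketch computes the partition function $Z(\beta)$ and the Gibbs probabilities by brute-force summation over a small finite state space. The energy function `H` and the 4-bit state space are illustrative choices, not taken from the paper.

```python
import math

def partition_function(H, states, beta):
    """Brute-force partition function Z(beta) = sum_x exp(-beta * H(x))."""
    return sum(math.exp(-beta * H(x)) for x in states)

def gibbs_prob(H, states, beta, x):
    """Gibbs probability P(X = x) = exp(-beta * H(x)) / Z(beta)."""
    return math.exp(-beta * H(x)) / partition_function(H, states, beta)

# Toy example (illustrative): H counts the number of 1-bits in a 4-bit string.
states = [(b0, b1, b2, b3)
          for b0 in (0, 1) for b1 in (0, 1)
          for b2 in (0, 1) for b3 in (0, 1)]
H = lambda x: sum(x)

# At beta = 0 every state has weight 1, so Z(0) = |states| = 16 >= 1,
# matching the condition Z(0) >= 1 in the abstract.
Z0 = partition_function(H, states, 0.0)
Z1 = partition_function(H, states, 1.0)
total = sum(gibbs_prob(H, states, 1.0, x) for x in states)  # probabilities sum to 1
```

Brute force is exponential in the system size, which is exactly why sample-efficient approximation schemes such as the one in this paper are needed for large state spaces.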
Citation
Mark Huber. "Approximation algorithms for the normalizing constant of Gibbs distributions." Ann. Appl. Probab. 25 (2) 974 - 985, April 2015. https://doi.org/10.1214/14-AAP1015