We obtain several quantitative bounds on the mixing properties of an “ideal” Hamiltonian Monte Carlo (HMC) Markov chain for a strongly log-concave target distribution π on ℝ^d. Our main result says that the HMC Markov chain generates a sample with Wasserstein error ϵ in roughly O(κ log(1/ϵ)) steps, where the condition number κ is the ratio of the maximum and minimum eigenvalues of the Hessian of −log(π). In particular, this mixing bound does not depend explicitly on the dimension d. These results significantly extend and improve previous quantitative bounds on the mixing of ideal HMC, and can be used to analyze more realistic HMC algorithms. The main ingredient of our argument is a proof that initially “parallel” Hamiltonian trajectories contract over much longer steps than would be predicted by previous heuristics based on the Jacobi metric.
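As an illustrative sketch (not part of the paper), the contraction of initially “parallel” Hamiltonian trajectories can be observed numerically on a Gaussian target, where “ideal” HMC is exactly computable because Hamilton's equations have a closed-form solution in the eigenbasis of the Hessian. Coupling two chains by sharing each resampled momentum, the distance between them shrinks, which upper-bounds the Wasserstein error. All parameter choices below (dimension, eigenvalue spectrum, integration time) are assumptions made for the example:

```python
import numpy as np

rng = np.random.default_rng(0)
d = 50
# Eigenvalues of the Hessian of -log(pi); condition number kappa = 100.
lam = np.linspace(1.0, 100.0, d)

def flow(q, p, t):
    """Exact Hamiltonian flow for H(q, p) = sum(lam * q**2) / 2 + |p|^2 / 2.

    Each coordinate is an independent harmonic oscillator with
    frequency sqrt(lam), so the flow is computed in closed form.
    """
    w = np.sqrt(lam)
    return (q * np.cos(w * t) + (p / w) * np.sin(w * t),
            -q * w * np.sin(w * t) + p * np.cos(w * t))

# Integration time per HMC step (a tunable choice for this sketch).
T = 0.5 / np.sqrt(lam.max())

# Two ideal HMC chains coupled by sharing momenta at every step.
x = rng.normal(size=d) * 5.0
y = np.zeros(d)
d0 = np.linalg.norm(x - y)
for _ in range(500):
    p = rng.normal(size=d)      # fresh momentum, shared by both chains
    x, _ = flow(x, p, T)
    y, _ = flow(y, p, T)
d1 = np.linalg.norm(x - y)
print(d0, d1)  # the coupled distance contracts over the run
```

The per-step contraction factor in the worst coordinate is roughly 1 − Θ(1/κ), which is consistent with the O(κ log(1/ϵ)) mixing bound stated above.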
Oren Mangoubi was supported by a Canadian Statistical Sciences Institute (CANSSI) Postdoctoral Fellowship, and by an NSERC Discovery grant. Aaron Smith was supported by an NSERC Discovery grant.
We are grateful to Natesh Pillai and Alain Durmus for helpful discussions, and to the anonymous reviewers of an earlier version of this paper for their helpful comments and suggestions.
Oren Mangoubi, Aaron Smith. "Mixing of Hamiltonian Monte Carlo on strongly log-concave distributions: Continuous dynamics." Ann. Appl. Probab. 31 (5), 2019–2045, October 2021. https://doi.org/10.1214/20-AAP1640