Open Access
October 2016
Discrepancy bounds for uniformly ergodic Markov chain quasi-Monte Carlo
Josef Dick, Daniel Rudolf, Houying Zhu
Ann. Appl. Probab. 26(5): 3178-3205 (October 2016). DOI: 10.1214/16-AAP1173

Abstract

Markov chains can be used to generate samples whose distribution approximates a given target distribution. The quality of the samples of such Markov chains can be measured by the discrepancy between the empirical distribution of the samples and the target distribution. We prove upper bounds on this discrepancy under the assumption that the Markov chain is uniformly ergodic and the driver sequence is deterministic rather than independent $U(0,1)$ random variables. In particular, we show the existence of driver sequences for which the discrepancy of the Markov chain from the target distribution with respect to certain test sets converges with (almost) the usual Monte Carlo rate of $n^{-1/2}$.
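The setting can be illustrated with a small sketch: a random-walk Metropolis chain targeting a standard normal, where the pair of uniforms needed at each step (proposal and accept/reject) comes from a deterministic van der Corput driver sequence instead of i.i.d. $U(0,1)$ variables, and the discrepancy is measured over half-line test sets. This is an assumed toy construction for illustration only; the paper proves the *existence* of good driver sequences via a probabilistic argument and does not prescribe this particular driver.

```python
import math
from statistics import NormalDist

def van_der_corput(n, base):
    """Radical inverse of n in the given base; a point in [0, 1)."""
    q, bk = 0.0, 1.0 / base
    while n > 0:
        q += (n % base) * bk
        n //= base
        bk /= base
    return q

def mcqmc_chain(n_steps, step=1.0):
    """Random-walk Metropolis chain for an N(0,1) target, driven by a
    deterministic sequence (van der Corput in bases 2 and 3) rather
    than i.i.d. uniforms."""
    nd = NormalDist()
    x, samples = 0.0, []
    for k in range(1, n_steps + 1):
        u1 = van_der_corput(k, 2)   # drives the proposal increment
        u2 = van_der_corput(k, 3)   # drives the accept/reject decision
        # clamp away from 0 and 1, where inv_cdf is unbounded
        u1 = min(max(u1, 1e-12), 1.0 - 1e-12)
        y = x + step * nd.inv_cdf(u1)
        # Metropolis acceptance ratio for the N(0,1) target
        if u2 < min(1.0, math.exp((x * x - y * y) / 2.0)):
            x = y
        samples.append(x)
    return samples

def star_discrepancy_1d(samples):
    """Sup over half-lines (-inf, t] of |empirical CDF - Phi(t)|,
    evaluated at the sample points (a Kolmogorov-Smirnov statistic)."""
    nd = NormalDist()
    xs = sorted(samples)
    n = len(xs)
    d = 0.0
    for i, t in enumerate(xs):
        f = nd.cdf(t)
        d = max(d, abs((i + 1) / n - f), abs(i / n - f))
    return d

samples = mcqmc_chain(4096)
print(star_discrepancy_1d(samples))
```

The printed value is the discrepancy between the chain's empirical distribution and the target over half-line test sets; under the paper's result, a well-chosen driver sequence makes this quantity decay at (almost) the Monte Carlo rate $n^{-1/2}$.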

Citation


Josef Dick. Daniel Rudolf. Houying Zhu. "Discrepancy bounds for uniformly ergodic Markov chain quasi-Monte Carlo." Ann. Appl. Probab. 26 (5) 3178 - 3205, October 2016. https://doi.org/10.1214/16-AAP1173

Information

Received: 1 March 2013; Revised: 1 February 2015; Published: October 2016
First available in Project Euclid: 19 October 2016

zbMATH: 1351.60100
MathSciNet: MR3563205
Digital Object Identifier: 10.1214/16-AAP1173

Subjects:
Primary: 60J22, 62F15, 65C40
Secondary: 60J05, 65C05

Keywords: discrepancy theory, Markov chain Monte Carlo, probabilistic method, uniformly ergodic Markov chain

Rights: Copyright © 2016 Institute of Mathematical Statistics
