Open Access
Convergence of adaptive mixtures of importance sampling schemes
R. Douc, A. Guillin, J.-M. Marin, C. P. Robert
Ann. Statist. 35(1): 420-448 (February 2007). DOI: 10.1214/009053606000001154

Abstract

In the design of efficient simulation algorithms, one is often beset with a poor choice of proposal distributions. Although the performance of a given simulation kernel can clarify a posteriori how adequate this kernel is for the problem at hand, a permanent on-line modification of kernels causes concerns about the validity of the resulting algorithm. While the issue is most often intractable for MCMC algorithms, the equivalent version for importance sampling algorithms can be validated quite precisely. We derive sufficient convergence conditions for adaptive mixtures of population Monte Carlo algorithms and show that Rao–Blackwellized versions asymptotically achieve an optimum in terms of a Kullback divergence criterion, while more rudimentary versions do not benefit from repeated updating.
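The adaptive scheme the abstract refers to can be illustrated with a minimal sketch: a fixed dictionary of Gaussian proposal kernels whose mixture weights are updated by a Rao–Blackwellized rule (each point contributes its posterior component probabilities, averaged under the normalized importance weights). The bimodal target, the scale dictionary, and all variable names below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def normal_pdf(x, mu, s):
    return np.exp(-0.5 * ((x - mu) / s) ** 2) / (s * np.sqrt(2 * np.pi))

# Toy bimodal target density (illustrative only).
def target(x):
    return 0.5 * normal_pdf(x, -3.0, 1.0) + 0.5 * normal_pdf(x, 3.0, 1.0)

# Fixed dictionary of zero-mean Gaussian kernels; only the mixture
# weights alpha are adapted across iterations.
scales = np.array([0.5, 1.0, 3.0, 10.0])
alpha = np.full(len(scales), 1.0 / len(scales))

N, T = 5000, 10
for t in range(T):
    # Draw component labels, then points, from the current mixture.
    comp = rng.choice(len(scales), size=N, p=alpha)
    x = rng.normal(0.0, scales[comp])
    # Importance weights use the full mixture density in the denominator.
    comp_dens = np.array([normal_pdf(x, 0.0, s) for s in scales])  # (D, N)
    mix_dens = alpha @ comp_dens                                   # (N,)
    w = target(x) / mix_dens
    w /= w.sum()  # self-normalize
    # Rao-Blackwellized update: responsibilities P(component d | x_i),
    # averaged under the normalized importance weights.
    resp = alpha[:, None] * comp_dens / mix_dens
    alpha = resp @ w
    alpha /= alpha.sum()

# Self-normalized estimate of E[X]; the true value is 0 by symmetry.
est = np.sum(w * x)
print("adapted weights:", np.round(alpha, 3), "E[X] estimate:", round(est, 3))
```

In this sketch the update concentrates the mixture weight on the scales that best cover the two modes (the very narrow kernel is driven toward weight zero), which is the Kullback-divergence improvement the abstract attributes to the Rao–Blackwellized version.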

Citation


R. Douc, A. Guillin, J.-M. Marin, C. P. Robert. "Convergence of adaptive mixtures of importance sampling schemes." Ann. Statist. 35(1): 420–448, February 2007. https://doi.org/10.1214/009053606000001154

Information

Published: February 2007
First available in Project Euclid: 6 June 2007

zbMATH: 1132.60022
MathSciNet: MR2332281
Digital Object Identifier: 10.1214/009053606000001154

Subjects:
Primary: 60F05, 62L12, 65-04, 65C05, 65C40, 65C60

Keywords: Bayesian statistics, Kullback divergence, LLN, MCMC algorithm, population Monte Carlo, proposal distribution, Rao–Blackwellization

Rights: Copyright © 2007 Institute of Mathematical Statistics
