The Annals of Applied Probability

Robust adaptive importance sampling for normal random vectors

Benjamin Jourdain and Jérôme Lelong

Adaptive Monte Carlo methods are very efficient techniques designed to tune simulation estimators on-line. In this work, we present an alternative to stochastic approximation to tune the optimal change of measure in the context of importance sampling for normal random vectors. Unlike stochastic approximation, which requires very fine tuning in practice, we propose to use sample average approximation and deterministic optimization techniques to devise a robust and fully automatic variance reduction methodology. The same samples are used in the sample optimization of the importance sampling parameter and in the Monte Carlo computation of the expectation of interest with the optimal measure computed in the previous step. We prove that this highly dependent Monte Carlo estimator is convergent and satisfies a central limit theorem with the optimal limiting variance. Numerical experiments confirm the performance of this estimator: in comparison with the crude Monte Carlo method, the computation time needed to achieve a given precision is divided by a factor between 3 and 15.
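The two-step scheme described above can be sketched as follows for a payoff f of a standard normal vector G ~ N(0, I_d): minimize an empirical, convex variance criterion over the drift parameter with a deterministic optimizer, then reuse the same samples in the shifted Monte Carlo estimator. This is a minimal illustration, not the authors' implementation; the helper name `adaptive_is` and its arguments are ours, and a production version would pass the analytic gradient and Hessian of the criterion to the optimizer.

```python
import numpy as np
from scipy.optimize import minimize

def adaptive_is(f, d, n, seed=None):
    """Sample-average-approximation importance sampling for E[f(G)],
    G ~ N(0, I_d). Illustrative sketch of the scheme in the abstract."""
    rng = np.random.default_rng(seed)
    G = rng.standard_normal((n, d))   # one batch, reused in both steps
    f2 = f(G) ** 2

    # Empirical version of the convex variance criterion
    #   v(theta) = E[ f(G)^2 * exp(-theta.G + |theta|^2 / 2) ],
    # minimized by deterministic optimization instead of
    # stochastic approximation.
    def v_hat(theta):
        return np.mean(f2 * np.exp(-G @ theta + 0.5 * theta @ theta))

    theta_star = minimize(v_hat, np.zeros(d)).x

    # Reuse the SAME samples under the shifted measure, via the
    # Cameron-Martin identity
    #   E[f(G)] = E[ f(G + theta) * exp(-theta.G - |theta|^2 / 2) ].
    weights = np.exp(-G @ theta_star - 0.5 * theta_star @ theta_star)
    return np.mean(f(G + theta_star) * weights), theta_star
```

As a sanity check, for f(x) = exp(x) in dimension 1 the optimal drift is theta = 1, at which the shifted estimator of E[exp(G)] = exp(1/2) has zero variance, so the returned estimate is nearly exact even for moderate n.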

Article information

Ann. Appl. Probab., Volume 19, Number 5 (2009), 1687-1718.

First available in Project Euclid: 16 October 2009


Subjects — Primary: 60F05 (central limit and other weak theorems); 62L20 (stochastic approximation); 65C05 (Monte Carlo methods); 90C15 (stochastic programming)

Keywords: adaptive importance sampling; central limit theorem; sample averaging


Jourdain, Benjamin; Lelong, Jérôme. Robust adaptive importance sampling for normal random vectors. Ann. Appl. Probab. 19 (2009), no. 5, 1687--1718. doi:10.1214/09-AAP595.
