The Annals of Statistics

On nonnegative unbiased estimators

Pierre E. Jacob and Alexandre H. Thiery

Abstract

We study the existence of algorithms generating almost surely nonnegative unbiased estimators. We show that given a nonconstant real-valued function $f$ and a sequence of unbiased estimators of $\lambda\in\mathbb{R}$, there is no algorithm yielding almost surely nonnegative unbiased estimators of $f(\lambda)\in\mathbb{R}^{+}$. The study is motivated by pseudo-marginal Monte Carlo algorithms that rely on such nonnegative unbiased estimators. These methods allow “exact inference” in intractable models, in the sense that integrals with respect to a target distribution can be estimated without any systematic error, even though the associated probability density function cannot be evaluated pointwise. We discuss the consequences of our results on the applicability of pseudo-marginal algorithms and thus on the possibility of exact inference in intractable models. We illustrate our study with particular choices of functions $f$ corresponding to known challenges in statistics, such as exact simulation of diffusions, inference in large datasets and doubly intractable distributions.
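For intuition, here is a minimal numerical sketch (in Python with NumPy, not taken from the article) of the classical Poisson estimator mentioned in the keywords: given i.i.d. unbiased estimators X_i of lambda, it produces an unbiased estimator of exp(lambda), but one that is not almost surely nonnegative. The names draw_x, lam and the tuning constant c are illustrative choices, not notation from the paper.

    import numpy as np

    rng = np.random.default_rng(1)

    def poisson_estimator(draw_x, c, rng):
        # Classical Poisson estimator: with N ~ Poisson(c) and i.i.d. unbiased
        # estimators X_1, ..., X_N of lambda, exp(c) * prod_i (X_i / c) has
        # expectation exp(lambda). It is unbiased but NOT almost surely
        # nonnegative: an odd number of negative X_i yields a negative value.
        n = rng.poisson(c)
        return np.exp(c) * np.prod([draw_x() / c for _ in range(n)])

    # Toy setup (illustrative choice): X ~ Normal(lam, 1) is unbiased for lam.
    lam, c = 0.5, 2.0
    draw_x = lambda: rng.normal(lam, 1.0)

    samples = np.array([poisson_estimator(draw_x, c, rng) for _ in range(200_000)])
    print(samples.mean())        # close to exp(0.5) ~ 1.649 (no systematic error)
    print((samples < 0).mean())  # a strictly positive fraction: the "sign problem"

In the notation of the abstract this is the case f(lambda) = exp(lambda): the estimator has the right expectation yet is negative with positive probability, and the paper's result implies that no algorithm can remove such negative values while preserving unbiasedness when f is nonconstant and lambda ranges over the real line.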

Article information

Source
Ann. Statist. Volume 43, Number 2 (2015), 769–784.

Dates
First available in Project Euclid: 3 March 2015

Permanent link to this document
https://projecteuclid.org/euclid.aos/1425398508

Digital Object Identifier
doi:10.1214/15-AOS1311

Mathematical Reviews number (MathSciNet)
MR3319143

Zentralblatt MATH identifier
1321.65015

Subjects
Primary: 65C50: Other computational problems in probability; 65C60: Computational problems in statistics; 68W20: Randomized algorithms

Keywords
Unbiased estimator; Poisson estimator; Monte Carlo methods; sign problem; Bernoulli factory

Citation

Jacob, Pierre E.; Thiery, Alexandre H. On nonnegative unbiased estimators. Ann. Statist. 43 (2015), no. 2, 769–784. doi:10.1214/15-AOS1311. https://projecteuclid.org/euclid.aos/1425398508


