Open Access
Outperforming the Gibbs sampler empirical estimator for nearest-neighbor random fields
Priscilla E. Greenwood, Ian W. McKeague, Wolfgang Wefelmeyer
Ann. Statist. 24(4): 1433-1456 (August 1996). DOI: 10.1214/aos/1032298276

Abstract

Given a Markov chain sampling scheme, does the standard empirical estimator make best use of the data? We show that this is not so and construct better estimators. We restrict attention to nearest-neighbor random fields and to Gibbs samplers with deterministic sweep, but our approach applies to any sampler that uses reversible variable-at-a-time updating with deterministic sweep. The structure of the transition distribution of the sampler is exploited to construct further empirical estimators that are combined with the standard empirical estimator to reduce asymptotic variance. The extra computational cost is negligible. When the random field is spatially homogeneous, symmetrizations of our estimator lead to further variance reduction. The performance of the estimators is evaluated in a simulation study of the Ising model.
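The setting described in the abstract — a Gibbs sampler with deterministic sweep on a nearest-neighbor random field, with the standard empirical estimator formed as the time average of a functional along the chain — can be sketched for the Ising model. This is a minimal illustrative sketch only: the function names, lattice size, and the choice of functional (absolute magnetization per site) are assumptions for demonstration, and it implements only the standard empirical estimator, not the improved estimators constructed in the paper.

```python
import numpy as np

def gibbs_sweep(grid, beta, rng):
    """One deterministic (raster-order) Gibbs sweep over an Ising lattice.

    Each site is redrawn from its full conditional distribution given the
    current values of its four nearest neighbors (periodic boundary),
    which is the variable-at-a-time updating referred to in the abstract.
    """
    n = grid.shape[0]
    for i in range(n):
        for j in range(n):
            # Sum of the four nearest-neighbor spins.
            s = (grid[(i - 1) % n, j] + grid[(i + 1) % n, j]
                 + grid[i, (j - 1) % n] + grid[i, (j + 1) % n])
            # Conditional probability that the site takes value +1.
            p_plus = 1.0 / (1.0 + np.exp(-2.0 * beta * s))
            grid[i, j] = 1 if rng.random() < p_plus else -1
    return grid

def empirical_estimate(beta=0.3, n=16, burn_in=200, sweeps=1000, seed=0):
    """Standard empirical estimator: the average of f over the sampled
    chain, here with f = absolute magnetization per site (an
    illustrative choice of functional, not one fixed by the paper)."""
    rng = np.random.default_rng(seed)
    grid = rng.choice(np.array([-1, 1]), size=(n, n))
    for _ in range(burn_in):
        gibbs_sweep(grid, beta, rng)
    total = 0.0
    for _ in range(sweeps):
        gibbs_sweep(grid, beta, rng)
        total += abs(grid.mean())
    return total / sweeps
```

The paper's point is that this time average, while consistent, does not make best use of the data: exploiting the known structure of the sampler's transition distribution yields additional empirical estimators that can be combined with the one above to reduce asymptotic variance at negligible extra cost.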

Citation


Priscilla E. Greenwood, Ian W. McKeague, Wolfgang Wefelmeyer. "Outperforming the Gibbs sampler empirical estimator for nearest-neighbor random fields." Ann. Statist. 24(4): 1433-1456, August 1996. https://doi.org/10.1214/aos/1032298276

Information

Published: August 1996
First available in Project Euclid: 17 September 2002

zbMATH: 0871.62083
MathSciNet: MR1416641
Digital Object Identifier: 10.1214/aos/1032298276

Subjects:
Primary: 62M40, 65U05
Secondary: 60J05, 62G20, 62M05

Keywords: Asymptotic relative efficiency, Ising model, Markov chain Monte Carlo, Metropolis-Hastings algorithm, parallel updating, variance reduction

Rights: Copyright © 1996 Institute of Mathematical Statistics
