Open Access
November 2015
Maximum likelihood estimators uniformly minimize distribution variance among distribution unbiased estimators in exponential families
Paul Vos, Qiang Wu
Bernoulli 21(4): 2120-2138 (November 2015). DOI: 10.3150/14-BEJ637

Abstract

We employ a parameter-free distribution estimation framework where estimators are random distributions and utilize the Kullback–Leibler (KL) divergence as a loss function. Wu and Vos [J. Statist. Plann. Inference 142 (2012) 1525–1536] show that when an estimator obtained from an i.i.d. sample is viewed as a random distribution, the KL risk of the estimator decomposes in a fashion parallel to the mean squared error decomposition when the estimator is a real-valued random variable. In this paper, we explore how conditional versions of distribution expectation ($E^{\dagger}$) can be defined so that a distribution version of the Rao–Blackwell theorem holds. We define distributional expectation and variance ($V^{\dagger}$) that also provide a decomposition of KL risk in exponential and mixture families. For exponential families, we show that the maximum likelihood estimator (viewed as a random distribution) is distribution unbiased and is the unique uniformly minimum distribution variance unbiased (UMV$^{\dagger}$U) estimator. Furthermore, we show that the MLE is robust against model misspecification in that if the true distribution does not belong to the exponential family, the MLE is UMV$^{\dagger}$U for the KL projection of the true distribution onto the exponential family, provided these two distributions have the same expectation for the canonical statistic. To allow for estimators taking values outside of the exponential family, we include results for KL projection and define an extended projection to accommodate the non-existence of the MLE for families having discrete sample space. Illustrative examples are provided.
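The role of the MLE as a KL projection can be illustrated numerically. The sketch below (not taken from the paper; the Poisson family and the sample values are assumptions chosen for illustration) checks the standard moment-matching fact the abstract relies on: within an exponential family, minimizing the KL divergence from the empirical distribution to the model matches the expectation of the canonical statistic, so for the Poisson family the minimizer is the sample mean, i.e. the MLE.

```python
import math

# Hypothetical i.i.d. sample from an unknown count distribution.
data = [2, 3, 1, 4, 2, 5, 3, 2]
mle = sum(data) / len(data)  # Poisson MLE = sample mean of the canonical statistic x

def kl_emp_to_poisson(lam):
    """KL(p_hat || Poisson(lam)), dropping terms constant in lam.

    -E_hat[log Pois(lam)(x)] = -E_hat[x] * log(lam) + lam + E_hat[log x!],
    and the last term does not depend on lam, so it is omitted.
    """
    return lam - mle * math.log(lam)

# Minimize over a grid of candidate rates; the minimizer should be the MLE.
grid = [0.5 + 0.01 * i for i in range(1, 1000)]
kl_minimizer = min(grid, key=kl_emp_to_poisson)
print(mle, kl_minimizer)
```

Setting the derivative of `lam - mean * log(lam)` to zero gives `lam = mean`, so the grid search recovers the sample mean up to grid resolution; this is the sense in which the MLE is the KL projection of the (empirical) distribution onto the family.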

Citation

Download Citation

Paul Vos. Qiang Wu. "Maximum likelihood estimators uniformly minimize distribution variance among distribution unbiased estimators in exponential families." Bernoulli 21 (4) 2120 - 2138, November 2015. https://doi.org/10.3150/14-BEJ637

Information

Received: 1 May 2013; Revised: 1 April 2014; Published: November 2015
First available in Project Euclid: 5 August 2015

zbMATH: 06502618
MathSciNet: MR3378461
Digital Object Identifier: 10.3150/14-BEJ637

Keywords: distribution unbiasedness, extended KL projection, Kullback–Leibler loss, MVUE, Pythagorean relationship, Rao–Blackwell

Rights: Copyright © 2015 Bernoulli Society for Mathematical Statistics and Probability
