Open Access
Asymptotic properties of predictive recursion: Robustness and rate of convergence
Ryan Martin, Surya T. Tokdar
Electron. J. Statist. 3: 1455-1472 (2009). DOI: 10.1214/09-EJS458

Abstract

Here we explore general asymptotic properties of Predictive Recursion (PR) for nonparametric estimation of mixing distributions. We prove that, when the mixture model is mis-specified, the estimated mixture converges almost surely in total variation to the mixture that minimizes the Kullback-Leibler divergence, and a bound on the (Hellinger contrast) rate of convergence is obtained. Simulations suggest that this rate is nearly sharp in a minimax sense. Moreover, when the model is identifiable, almost sure weak convergence of the mixing distribution estimate follows.

PR assumes that the support of the mixing distribution is known. To remove this requirement, we propose a generalization that incorporates a sequence of supports, increasing with the sample size, that combines the efficiency of PR with the flexibility of mixture sieves. Under mild conditions, we obtain a bound on the rate of convergence of these new estimates.
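The PR update rule itself is not stated in the abstract. As a rough sketch, the standard predictive recursion (due to Newton) on a fixed, evenly spaced grid of support points can be written as below; the normal kernel, the weight sequence w_i = 1/(i+1), and all names are illustrative choices, not taken from the paper.

```python
import numpy as np

def predictive_recursion(x, grid, kernel_sd=1.0):
    """Sketch of predictive recursion (PR) for a mixing density.

    Assumptions (not from the paper): a normal kernel
    k(x | theta) = N(x; theta, kernel_sd^2), a uniform initial guess,
    and the common weight sequence w_i = 1/(i + 1).
    """
    dx = grid[1] - grid[0]                       # grid assumed evenly spaced
    f = np.full(grid.shape, 1.0 / (dx * grid.size))   # uniform initial density
    for i, xi in enumerate(x, start=1):
        w = 1.0 / (i + 1)                        # weight w_i, decaying like 1/i
        like = np.exp(-0.5 * ((xi - grid) / kernel_sd) ** 2) / (
            kernel_sd * np.sqrt(2.0 * np.pi))    # kernel k(x_i | theta) on the grid
        m = np.sum(like * f) * dx                # marginal density m_{i-1}(x_i)
        f = (1.0 - w) * f + w * like * f / m     # PR update of the mixing density
    return f
```

The returned array approximates the mixing density on the grid; the corresponding mixture density estimate at a point x is the integral of k(x | theta) against it. Note that the estimate depends on the order of the data, which is why asymptotic analyses of PR treat the observations as a stochastic sequence.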

Citation


Ryan Martin, Surya T. Tokdar. "Asymptotic properties of predictive recursion: Robustness and rate of convergence." Electron. J. Statist. 3: 1455-1472, 2009. https://doi.org/10.1214/09-EJS458

Information

Published: 2009
First available in Project Euclid: 24 December 2009

zbMATH: 1326.62107
MathSciNet: MR2578833
Digital Object Identifier: 10.1214/09-EJS458

Subjects:
Primary: 62G20
Secondary: 62G05, 62G07, 62G35

Rights: Copyright © 2009 The Institute of Mathematical Statistics and the Bernoulli Society

Journal article, 18 pages

