August 2000 Rates of convergence for the Gaussian mixture sieve
Christopher R. Genovese, Larry Wasserman
Ann. Statist. 28(4): 1105-1127 (August 2000). DOI: 10.1214/aos/1015956709


Gaussian mixtures provide a convenient method of density estimation that lies somewhere between parametric models and kernel density estimators. When the number of components of the mixture is allowed to increase as the sample size increases, the model is called a mixture sieve. We establish a bound on the rate of convergence in Hellinger distance for density estimation using the Gaussian mixture sieve, assuming that the true density is itself a mixture of Gaussians; the underlying mixing measure of the true density is not necessarily assumed to have finite support. Computing the rate involves some delicate calculations, since the size of the sieve, as measured by bracketing entropy, and the saturation rate cannot be found using standard methods. When the mixing measure has compact support, using $k_n \sim n^{2/3}/(\log n)^{1/3}$ components in the mixture yields a rate of order $(\log n)^{(1+\eta)/6}/n^{1/6}$ for every $\eta > 0$. The rates depend heavily on the tail behavior of the true density. The sensitivity to the tail behavior is diminished by using a robust sieve which includes a long-tailed component in the mixture. In the compact case, we obtain an improved rate of $(\log n/n)^{1/4}$. In the noncompact case, a spectrum of interesting rates arises depending on the thickness of the tails of the mixing measure.
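As a rough numerical illustration of the asymptotic rates quoted in the abstract (a sketch only; constant factors are suppressed in the asymptotics, and the function names below are not from the paper), one can tabulate the sieve size $k_n$ and the two Hellinger rates for a few sample sizes:

```python
import math

def sieve_size(n):
    """Sieve size k_n ~ n^(2/3) / (log n)^(1/3), ignoring constants."""
    return n ** (2 / 3) / math.log(n) ** (1 / 3)

def compact_rate(n, eta=0.01):
    """Rate (log n)^((1+eta)/6) / n^(1/6) for the compact-support case."""
    return math.log(n) ** ((1 + eta) / 6) / n ** (1 / 6)

def robust_rate(n):
    """Improved rate (log n / n)^(1/4) for the robust sieve, compact case."""
    return (math.log(n) / n) ** 0.25

if __name__ == "__main__":
    for n in [10**3, 10**5, 10**7]:
        print(f"n={n:>8}: k_n ~ {sieve_size(n):9.0f}, "
              f"plain rate ~ {compact_rate(n):.4f}, "
              f"robust rate ~ {robust_rate(n):.4f}")
```

The tabulation makes visible that the robust sieve's $(\log n/n)^{1/4}$ rate decays faster than the $n^{-1/6}$-order rate of the plain sieve, at the price of adding a long-tailed component to the mixture.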




Published: August 2000
First available in Project Euclid: 12 March 2002

zbMATH: 1105.62333
MathSciNet: MR1810921
Digital Object Identifier: 10.1214/aos/1015956709


Rights: Copyright © 2000 Institute of Mathematical Statistics

