Open Access
Rate of divergence of the nonparametric likelihood ratio test for Gaussian mixtures
Wenhua Jiang, Cun-Hui Zhang
Bernoulli 25(4B): 3400-3420 (November 2019). DOI: 10.3150/18-BEJ1094


We study a nonparametric likelihood ratio test (NPLRT) for Gaussian mixtures, based on the nonparametric maximum likelihood estimator in the context of demixing. The test concerns whether a random sample is drawn from the standard normal distribution. For the alternative hypothesis, we consider mixing distributions with unbounded support. We prove that the divergence rate of the NPLRT under the null is bounded by $\log n$, provided that the support range of the mixing distribution increases no faster than $(\log n/\log 9)^{1/2}$. We also prove that the rate $\sqrt{\log n}$ is a lower bound for the divergence rate if the support range increases no slower than the order of $\sqrt{\log n}$. Implications of the upper bound on the rate of divergence are discussed.
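To make the quantities in the abstract concrete, the following is a minimal sketch (not the authors' implementation) of the NPLRT statistic: the mixing distribution is restricted to a finite grid on a bounded support range, its weights are fitted by a standard EM iteration for the NPMLE, and the statistic is twice the log-likelihood gap between the fitted Gaussian mixture and the standard normal null. The grid size, iteration count, and function names here are illustrative assumptions.

```python
import numpy as np

def npmle_mixture_weights(x, grid, n_iter=500):
    """EM iteration for NPMLE mixing weights on a fixed support grid (sketch)."""
    # A[i, j] = N(grid[j], 1) density evaluated at x[i]
    A = np.exp(-0.5 * (x[:, None] - grid[None, :]) ** 2) / np.sqrt(2 * np.pi)
    w = np.full(len(grid), 1.0 / len(grid))  # uniform start
    for _ in range(n_iter):
        dens = A @ w                            # mixture density at each x[i]
        w = w * (A / dens[:, None]).mean(axis=0)  # EM weight update
    return w

def nplrt_statistic(x, support_range, grid_size=101):
    """2 * (NPMLE mixture log-likelihood - standard-normal log-likelihood)."""
    grid = np.linspace(-support_range, support_range, grid_size)
    w = npmle_mixture_weights(x, grid)
    A = np.exp(-0.5 * (x[:, None] - grid[None, :]) ** 2) / np.sqrt(2 * np.pi)
    loglik_mix = np.log(A @ w).sum()
    loglik_null = (-0.5 * x ** 2 - 0.5 * np.log(2 * np.pi)).sum()  # N(0, 1)
    return 2.0 * (loglik_mix - loglik_null)
```

The paper's results concern how fast this statistic grows with $n$ under the null when `support_range` is allowed to grow with the sample size; under a genuinely mixed alternative the statistic diverges much faster.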


Citation

Wenhua Jiang, Cun-Hui Zhang. "Rate of divergence of the nonparametric likelihood ratio test for Gaussian mixtures." Bernoulli 25(4B): 3400-3420, November 2019.


Received: 1 July 2017; Revised: 1 November 2018; Published: November 2019
First available in Project Euclid: 25 September 2019

zbMATH: 07110142
MathSciNet: MR4010959
Digital Object Identifier: 10.3150/18-BEJ1094

Keywords: Gaussian mixtures, Hermite polynomials, likelihood ratio test, rate of divergence, two-component mixtures

Rights: Copyright © 2019 Bernoulli Society for Mathematical Statistics and Probability
