Abstract
We study a nonparametric likelihood ratio test (NPLRT) for Gaussian mixtures, based on the nonparametric maximum likelihood estimator of the mixing distribution in the context of demixing. The test concerns whether a random sample is drawn from the standard normal distribution. We consider mixing distributions with unbounded support under the alternative hypothesis. We prove that the divergence rate of the NPLRT under the null hypothesis is bounded by $\log n$, provided that the support range of the mixing distribution increases no faster than $(\log n/\log 9)^{1/2}$. We also prove that $\sqrt{\log n}$ is a lower bound for the divergence rate if the support range increases at least at the order of $\sqrt{\log n}$. Implications of the upper bound on the divergence rate are discussed.
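The sketch below is a minimal illustration, not the authors' implementation: it approximates the NPMLE of the mixing distribution on a fixed grid via EM (a standard approximation for Gaussian location mixtures) and forms the log-likelihood ratio against the standard normal null. The support bound taken from the abstract's condition, the grid size, and the iteration count are all assumptions made for the example.

```python
# Minimal sketch of an NPLRT-type statistic, assuming a grid-based EM
# approximation of the NPMLE; choices of support bound, grid size, and
# iteration count are illustrative, not from the paper.
import numpy as np
from scipy.stats import norm

def nplrt_statistic(x, n_grid=201, n_iter=500):
    n = len(x)
    m_n = np.sqrt(np.log(n) / np.log(9.0))   # support range matching the abstract's condition (assumed)
    grid = np.linspace(-m_n, m_n, n_grid)    # candidate support points of the mixing distribution (includes 0)
    w = np.full(n_grid, 1.0 / n_grid)        # initial mixing weights

    # Normal kernel: K[i, j] = phi(x_i - u_j), likelihood of x_i under component centered at u_j.
    K = norm.pdf(x[:, None] - grid[None, :])

    # EM iterations for the grid-restricted nonparametric MLE of the mixing weights.
    for _ in range(n_iter):
        post = K * w                          # unnormalized posterior component memberships
        post /= post.sum(axis=1, keepdims=True)
        w = post.mean(axis=0)

    loglik_mix = np.sum(np.log(K @ w))        # log-likelihood under the fitted mixture
    loglik_null = np.sum(norm.logpdf(x))      # log-likelihood under N(0, 1)
    return loglik_mix - loglik_null           # approximate log-likelihood ratio statistic

# Under the null, the statistic should grow slowly with n (at most of order log n).
rng = np.random.default_rng(0)
print(nplrt_statistic(rng.standard_normal(1000)))
```

Because the null density is itself a mixture (a point mass at zero), the exact NPMLE yields a nonnegative log-likelihood ratio; the grid restriction and finite EM iterations make the value above only approximately so.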
Citation
Wenhua Jiang and Cun-Hui Zhang. "Rate of divergence of the nonparametric likelihood ratio test for Gaussian mixtures." Bernoulli 25(4B), 3400–3420, November 2019. https://doi.org/10.3150/18-BEJ1094