Rate of divergence of the nonparametric likelihood ratio test for Gaussian mixtures
Wenhua Jiang, Cun-Hui Zhang
Bernoulli 25(4B): 3400-3420 (November 2019). DOI: 10.3150/18-BEJ1094

Abstract

We study a nonparametric likelihood ratio test (NPLRT) for Gaussian mixtures, based on the nonparametric maximum likelihood estimator in the context of demixing. The test concerns whether a random sample is drawn from the standard normal distribution. We consider mixing distributions of unbounded support for the alternative hypothesis. We prove that the divergence rate of the NPLRT under the null is bounded by $\log n$, provided that the support range of the mixing distribution increases no faster than $(\log n/\log 9)^{1/2}$. We prove that the rate $\sqrt{\log n}$ is a lower bound for the divergence rate if the support range increases no slower than the order of $\sqrt{\log n}$. Implications of the upper bound for the rate of divergence are discussed.
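The NPLRT statistic compares the log-likelihood of the sample at the nonparametric maximum likelihood estimator (NPMLE) of the mixing distribution with the log-likelihood under the standard normal null. As a rough illustration of the quantity being studied, the sketch below approximates the NPMLE on a fixed grid of support points in $[-a_n, a_n]$ using standard EM fixed-point iterations. This is not the authors' computation: the function name, grid size, iteration count, and the conventional factor 2 in the statistic are all illustrative choices.

```python
import numpy as np
from scipy.stats import norm

def nplrt_statistic(x, support_bound, n_grid=200, n_iter=500):
    """Approximate NPLRT statistic for H0: x ~ N(0, 1) against Gaussian
    location mixtures whose mixing distribution is supported on
    [-support_bound, support_bound].

    The NPMLE of the mixing distribution is approximated on a fixed grid
    of support points via EM fixed-point iterations (a common heuristic;
    not the paper's method).
    """
    x = np.asarray(x, dtype=float)
    grid = np.linspace(-support_bound, support_bound, n_grid)
    # kernel[i, j] = phi(x_i - theta_j), the N(theta_j, 1) density at x_i
    kernel = norm.pdf(x[:, None] - grid[None, :])
    w = np.full(n_grid, 1.0 / n_grid)        # initial mixing weights
    for _ in range(n_iter):
        mix = kernel @ w                     # mixture density at each x_i
        # EM update: w_j <- w_j * mean_i( phi(x_i - theta_j) / mix(x_i) ).
        # The update keeps the weights summing to one automatically.
        w *= (kernel / mix[:, None]).mean(axis=0)
    loglik_alt = np.sum(np.log(kernel @ w))  # log-likelihood at approximate NPMLE
    loglik_null = np.sum(norm.logpdf(x))     # log-likelihood under N(0, 1)
    # Factor 2 is the conventional LRT scaling; the paper's divergence
    # rates are unaffected by constant factors.
    return 2.0 * (loglik_alt - loglik_null)

# Under the null, the statistic grows slowly with n: at most order log n
# when the support range a_n grows like the bound quoted in the abstract.
rng = np.random.default_rng(0)
n = 2000
x = rng.standard_normal(n)
a_n = np.sqrt(np.log(n) / np.log(9))         # support range from the abstract
print(nplrt_statistic(x, a_n))
```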

Citation


Wenhua Jiang, Cun-Hui Zhang. "Rate of divergence of the nonparametric likelihood ratio test for Gaussian mixtures." Bernoulli 25(4B): 3400-3420, November 2019. https://doi.org/10.3150/18-BEJ1094

Information

Received: 1 July 2017; Revised: 1 November 2018; Published: November 2019
First available in Project Euclid: 25 September 2019

zbMATH: 07110142
MathSciNet: MR4010959
Digital Object Identifier: 10.3150/18-BEJ1094

Rights: Copyright © 2019 Bernoulli Society for Mathematical Statistics and Probability

JOURNAL ARTICLE
21 PAGES
