Abstract
Gaussian mixture models form a flexible and expressive parametric family of distributions that has found a variety of applications. Unfortunately, fitting these models to data is a notoriously hard computational problem. Currently, only moment-based methods enjoy theoretical guarantees, while likelihood-based methods are dominated by heuristics, such as Expectation-Maximization, that are known to fail in simple examples. In this work, we propose a new algorithm to compute the nonparametric maximum likelihood estimator (NPMLE) in a Gaussian mixture model. Our method is based on gradient descent over the space of probability measures equipped with the Wasserstein–Fisher–Rao geometry, for which we establish convergence guarantees. In practice, the method can be approximated using an interacting particle system in which the weights and locations of the particles are updated alternately. We conduct extensive numerical experiments confirming the effectiveness of the proposed algorithm compared not only to classical benchmarks but also to similar gradient descent algorithms based on simpler geometries. In particular, these simulations illustrate the benefit of updating both the weights and the locations of the interacting particles.
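To make the alternating particle scheme concrete, here is a minimal Python sketch of one way such an interacting particle system can look for a one-dimensional Gaussian location mixture with unit variance. It is an illustration under stated assumptions, not the paper's exact algorithm: the function name wfr_npmle, the step sizes, and the explicit-Euler time discretization are all choices made for this sketch. Particle locations follow the Wasserstein part of the geometry (a gradient step on the first variation of the log-likelihood), and weights follow the Fisher–Rao part (a multiplicative update on the probability simplex).

    import numpy as np

    def wfr_npmle(x, m=100, n_iter=500, eta_loc=0.5, eta_w=0.5, seed=0):
        # Sketch (illustrative, not the paper's exact scheme): approximate the
        # NPMLE of a 1-D Gaussian location mixture with unit variance using m
        # weighted particles, alternating location and weight updates.
        rng = np.random.default_rng(seed)
        theta = rng.choice(x, size=m, replace=True)  # init particles at data points
        w = np.full(m, 1.0 / m)                      # uniform initial weights
        for _ in range(n_iter):
            diff = x[:, None] - theta[None, :]                 # (n, m)
            K = np.exp(-0.5 * diff**2) / np.sqrt(2.0 * np.pi)  # phi(x_i - theta_j)
            p = K @ w                                          # mixture density at each x_i
            # First variation of the log-likelihood at each particle:
            #   V_j = (1/n) sum_i phi(x_i - theta_j) / p_i,  with sum_j w_j V_j = 1.
            V = (K / p[:, None]).mean(axis=0)
            # Wasserstein step: move particles along the gradient of the first
            # variation, using d/dtheta phi(x - theta) = phi(x - theta)(x - theta):
            #   v_j = (1/n) sum_i phi(x_i - theta_j) (x_i - theta_j) / p_i.
            v = (K * diff / p[:, None]).mean(axis=0)
            theta = theta + eta_loc * v
            # Fisher-Rao step: multiplicative (mirror-descent) weight update,
            # then renormalize to stay on the probability simplex.
            w = w * np.exp(eta_w * (V - 1.0))
            w = w / w.sum()
        return theta, w

For instance, theta, w = wfr_npmle(x) on data drawn from an equal mixture of N(-2, 1) and N(2, 1) returns weighted particles approximating the NPMLE. Setting eta_w = 0 gives a location-only (pure Wasserstein) scheme, and eta_loc = 0 a weight-only (pure Fisher–Rao) scheme; these are examples of the simpler geometries against which the combined update is compared in the abstract.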
Funding Statement
Y. Yan was supported in part by the Charlotte Elizabeth Procter Honorific Fellowship from Princeton University and the Norbert Wiener Postdoctoral Fellowship from MIT.
K. Wang was supported by NSF Grant DMS-2210907 and a start-up grant at Columbia University.
P. Rigollet was supported by NSF Grants IIS-1838071, DMS-2022448 and CCF-2106377.
Acknowledgments
The authors thank Donghao Wang for a helpful discussion.
Part of this work was done during Y. Yan’s visit to MIT in Fall 2022.
Yuling Yan and Kaizheng Wang contributed equally.
Citation
Yuling Yan, Kaizheng Wang, Philippe Rigollet. "Learning Gaussian mixtures using the Wasserstein–Fisher–Rao gradient flow." Ann. Statist. 52(4): 1774–1795, August 2024. https://doi.org/10.1214/24-AOS2416