Open Access
Improved convergence guarantees for learning Gaussian mixture models by EM and gradient EM
Nimrod Segol, Boaz Nadler
Electron. J. Statist. 15(2): 4510-4544 (2021). DOI: 10.1214/21-EJS1905


We consider the problem of estimating the parameters of a Gaussian mixture model with K components, all with known weights and identity covariance matrices. We make two contributions. First, at the population level, we present a sharper analysis of the local convergence of EM and gradient EM than in previous works. Assuming a separation of Ω(√log K), we prove convergence of both methods to the global optimum from an initialization region larger than those of previous works. Specifically, the initial guess for each component can be as far as (almost) half its distance to the nearest Gaussian. This is essentially the largest possible contraction region. Our second contribution is improved sample size requirements for accurate estimation by EM and gradient EM. In previous works, the required number of samples had a quadratic dependence on the maximal separation between the K components, and the resulting error estimate increased linearly with this maximal separation. In this manuscript we show that both quantities depend only logarithmically on the maximal separation.
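The setting analyzed in the paper admits a particularly simple EM iteration: with identity covariances and known mixing weights, only the component means are updated. The following sketch (not the authors' code; function and variable names are our own) illustrates that iteration with NumPy. `X` is the data matrix, `mu0` the initial means, and `weights` the known mixing weights:

```python
import numpy as np

def em_spherical_gmm(X, mu0, weights, n_iter=50):
    """EM for a K-component Gaussian mixture with identity covariances
    and known mixing weights: only the means are updated.

    X: (n, d) data, mu0: (K, d) initial means, weights: (K,) known weights.
    Returns the (K, d) array of estimated means."""
    mu = mu0.copy()
    for _ in range(n_iter):
        # E-step: responsibilities p(z_i = k | x_i) under N(mu_k, I).
        sq = ((X[:, None, :] - mu[None, :, :]) ** 2).sum(axis=2)   # (n, K)
        log_r = np.log(weights)[None, :] - 0.5 * sq
        log_r -= log_r.max(axis=1, keepdims=True)                  # numerical stability
        r = np.exp(log_r)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: responsibility-weighted means (the mixing weights
        # themselves are known and held fixed).
        mu = (r[:, :, None] * X[:, None, :]).sum(axis=0) / r.sum(axis=0)[:, None]
    return mu
```

Under the separation and initialization conditions of the paper, e.g. each initial mean within (almost) half the distance to its nearest neighboring component, these iterates contract toward the true means.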

Funding Statement

This research was partially supported by the Israeli Council for Higher Education (CHE) via the Weizmann Data Science Research Center, and by a research grant from the Estate of Tully and Michele Plesser.


We thank the associate editor and anonymous referee for several constructive suggestions.


Download Citation

Nimrod Segol, Boaz Nadler. "Improved convergence guarantees for learning Gaussian mixture models by EM and gradient EM." Electron. J. Statist. 15(2): 4510-4544, 2021.


Received: 1 January 2021; Published: 2021
First available in Project Euclid: 23 September 2021

Digital Object Identifier: 10.1214/21-EJS1905

Primary: 62F10
Secondary: 62F99

Keywords: EM algorithm , Gaussian mixture models

