Convergence rates of latent topic models under relaxed identifiability conditions (2019)
Yining Wang
Electron. J. Statist. 13(1): 37-66 (2019). DOI: 10.1214/18-EJS1516

Abstract

In this paper we study the frequentist convergence rate for Latent Dirichlet Allocation (Blei, Ng and Jordan, 2003) topic models. We show that the maximum likelihood estimator converges, in the Wasserstein distance, to one of finitely many equivalent parameters at a rate of $n^{-1/4}$, without assuming separability or non-degeneracy of the underlying topics or the availability of more than three words per document, thus generalizing the previous works of Anandkumar et al. (2012, 2014) from an information-theoretical perspective. We also show that the $n^{-1/4}$ convergence rate is optimal in the worst case.
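The phrase "converges to one of the finitely many equivalent parameters" refers to the fact that topic models are identifiable only up to relabeling of the topics, so estimation error must be measured by a metric that ignores topic order. The following is a minimal illustrative sketch (not code from the paper): it computes a permutation-matched L1 distance between two finite sets of topic vectors, a simple proxy for the Wasserstein metric over mixing measures used in the analysis. The function name and the example topic vectors are hypothetical.

```python
import itertools

def topic_distance(A, B):
    """Distance between two topic sets up to label permutation.

    A and B are equal-length lists of topic vectors (probability
    distributions over the vocabulary).  We minimize the total L1
    mismatch over all topic matchings -- a crude finite proxy for the
    Wasserstein distance between the corresponding mixing measures.
    """
    assert len(A) == len(B)
    best = float("inf")
    for perm in itertools.permutations(range(len(B))):
        # Total L1 cost of matching topic i in A to topic perm[i] in B.
        cost = sum(
            sum(abs(a - b) for a, b in zip(A[i], B[perm[i]]))
            for i in range(len(A))
        )
        best = min(best, cost)
    return best

# Two topic sets that differ only by relabeling: the distance is zero,
# reflecting that both parameterizations are statistically equivalent.
A = [[0.7, 0.2, 0.1], [0.1, 0.1, 0.8]]
B = [[0.1, 0.1, 0.8], [0.7, 0.2, 0.1]]
print(topic_distance(A, B))  # 0.0
```

The brute-force search over permutations is exponential in the number of topics; it is meant only to make the "equivalence class of parameters" concrete, not to be used at scale.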

Citation


Yining Wang. "Convergence rates of latent topic models under relaxed identifiability conditions." Electron. J. Statist. 13 (1) 37 - 66, 2019. https://doi.org/10.1214/18-EJS1516

Information

Received: 1 March 2018; Published: 2019
First available in Project Euclid: 4 January 2019

zbMATH: 07003257
MathSciNet: MR3896145
Digital Object Identifier: 10.1214/18-EJS1516

Journal article, 30 pages

