2019 Central limit theorems for entropy-regularized optimal transport on finite spaces and statistical applications
Jérémie Bigot, Elsa Cazelles, Nicolas Papadakis
Electron. J. Statist. 13(2): 5120-5150 (2019). DOI: 10.1214/19-EJS1637


The notion of entropy-regularized optimal transport, also known as Sinkhorn divergence, has recently gained popularity in machine learning and statistics, as it makes feasible the use of smoothed optimal transportation distances for data analysis. The Sinkhorn divergence allows the fast computation of an entropically regularized Wasserstein distance between two probability distributions supported on a finite metric space of (possibly) high dimension. For data sampled from one or two unknown probability distributions, we derive the distributional limits of the empirical Sinkhorn divergence and its centered version (Sinkhorn loss). We also propose a bootstrap procedure which yields new test statistics for measuring the discrepancies between multivariate probability distributions. Our work is inspired by the results of Sommerfeld and Munk in [33] on the asymptotic distribution of the empirical Wasserstein distance on finite spaces using unregularized transportation costs. Incidentally, we also analyze the asymptotic distribution of entropy-regularized Wasserstein distances when the regularization parameter tends to zero. Simulated and real datasets are used to illustrate our approach.
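To make the computational object concrete, the sketch below shows the standard Sinkhorn iterations for the entropy-regularized Wasserstein cost between two histograms on a finite space. This is a generic illustration of the well-known algorithm, not the authors' code; the function name, regularization value, and example data are illustrative choices.

```python
import numpy as np

def sinkhorn_cost(a, b, C, eps=0.1, n_iter=500):
    """Entropy-regularized optimal transport cost <P, C> between
    histograms a and b on a finite space with cost matrix C,
    computed via Sinkhorn matrix-scaling iterations."""
    K = np.exp(-C / eps)             # Gibbs kernel
    u = np.ones_like(a)
    for _ in range(n_iter):
        v = b / (K.T @ u)            # alternate scaling updates
        u = a / (K @ v)
    P = u[:, None] * K * v[None, :]  # regularized transport plan
    return float(np.sum(P * C))      # transport cost under P

# Example: two distributions on 3 points with squared-distance cost
x = np.arange(3, dtype=float)
C = (x[:, None] - x[None, :]) ** 2
a = np.array([0.5, 0.3, 0.2])
b = np.array([0.2, 0.3, 0.5])
d = sinkhorn_cost(a, b, C, eps=0.05)
```

As the regularization parameter `eps` tends to zero, this cost converges to the unregularized Wasserstein cost, which is the regime studied in the paper's small-regularization asymptotics.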



Received: 1 February 2019; Published: 2019
First available in Project Euclid: 12 December 2019

zbMATH: 07147373
MathSciNet: MR4041704
Digital Object Identifier: 10.1214/19-EJS1637

Primary: 62G10, 62G20, 65C60

