February 2019 Efficient multivariate entropy estimation via $k$-nearest neighbour distances
Thomas B. Berrett, Richard J. Samworth, Ming Yuan
Ann. Statist. 47(1): 288-318 (February 2019). DOI: 10.1214/18-AOS1688


Many statistical procedures, including goodness-of-fit tests and methods for independent component analysis, rely critically on the estimation of the entropy of a distribution. In this paper, we seek entropy estimators that are efficient and achieve the local asymptotic minimax lower bound with respect to squared error loss. To this end, we study weighted averages of the estimators originally proposed by Kozachenko and Leonenko [Probl. Inform. Transm. 23 (1987) 95–101], based on the $k$-nearest neighbour distances of a sample of $n$ independent and identically distributed random vectors in $\mathbb{R}^{d}$. A careful choice of weights enables us to obtain an efficient estimator in arbitrary dimensions, given sufficient smoothness, whereas the original unweighted estimator is typically efficient only when $d\leq 3$. In addition to the new estimator and the theoretical understanding it provides, our results facilitate the construction of asymptotically valid confidence intervals for the entropy that have asymptotically minimal width.
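To make the abstract concrete, the following is a minimal sketch of the classical unweighted Kozachenko–Leonenko estimator that the paper takes as its starting point: the entropy is estimated from the distance of each sample point to its $k$-th nearest neighbour, via $\hat H_n = \frac{1}{n}\sum_{i=1}^n \log\bigl((n-1)\,e^{-\Psi(k)}\,V_d\,\rho_{k,i}^d\bigr)$, where $V_d$ is the volume of the unit $d$-ball and $\Psi$ is the digamma function. The function name, the choice $k=3$, and the simulation setup below are illustrative, not taken from the paper, and this sketch omits the paper's weighting scheme entirely.

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma, gammaln

def kl_entropy(x, k=3):
    """Unweighted Kozachenko-Leonenko entropy estimate, in nats.

    x : (n, d) array of i.i.d. samples from a continuous distribution.
    k : which nearest neighbour to use.
    """
    x = np.asarray(x, dtype=float)
    n, d = x.shape
    tree = cKDTree(x)
    # query returns each point itself as its own nearest neighbour,
    # so ask for k+1 neighbours and keep the (k+1)-th distance.
    rho = tree.query(x, k=k + 1)[0][:, k]
    # log volume of the unit d-ball: V_d = pi^{d/2} / Gamma(d/2 + 1)
    log_vd = (d / 2) * np.log(np.pi) - gammaln(d / 2 + 1)
    return np.log(n - 1) - digamma(k) + log_vd + d * np.mean(np.log(rho))

# Illustrative check on a standard normal sample in d = 1,
# whose true entropy is 0.5 * log(2 * pi * e) ≈ 1.419 nats.
rng = np.random.default_rng(0)
est = kl_entropy(rng.standard_normal((2000, 1)), k=3)
```

The weighted estimators studied in the paper replace the single $k$-th neighbour distance by a weighted combination over several values of $k$, which is what removes the higher-order bias terms that prevent the plain estimator above from being efficient when $d > 3$.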




Received: 1 June 2017; Revised: 1 November 2017; Published: February 2019
First available in Project Euclid: 30 November 2018

zbMATH: 07036202
MathSciNet: MR3909934
Digital Object Identifier: 10.1214/18-AOS1688

Primary: 62G05, 62G20

Rights: Copyright © 2019 Institute of Mathematical Statistics


