Minimax rates for conditional density estimation via empirical entropy
Blair Bilodeau, Dylan J. Foster, Daniel M. Roy
Ann. Statist. 51(2): 762-790 (April 2023). DOI: 10.1214/23-AOS2270

Abstract

We consider the task of estimating a conditional density using i.i.d. samples from a joint distribution, which is a fundamental problem with applications in both classification and uncertainty quantification for regression. For joint density estimation, minimax rates have been characterized for general density classes in terms of uniform (metric) entropy, a well-studied notion of statistical capacity. When applying these results to conditional density estimation, the use of uniform entropy—which is infinite when the covariate space is unbounded and suffers from the curse of dimensionality—can lead to suboptimal rates. Consequently, minimax rates for conditional density estimation cannot be characterized using these classical results.
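
As a rough illustration of the classical characterization referred to above (the notation here is generic and not taken from the paper), for a density class $\mathcal{F}$ with $\varepsilon$-covering number $N(\varepsilon, \mathcal{F}, d_{\mathrm{H}})$ under the Hellinger distance, the minimax rate $\varepsilon_n^2$ is typically obtained by balancing metric entropy against sample size,
\[
\log N(\varepsilon_n, \mathcal{F}, d_{\mathrm{H}}) \;\asymp\; n\,\varepsilon_n^2,
\]
so that the rate is governed by a fixed point of the uniform entropy. For conditional densities, a covering taken uniformly over all covariate values can be infinite when the covariate space is unbounded, which is the obstruction described above.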

We resolve this problem for well-specified models, obtaining matching (within logarithmic factors) upper and lower bounds on the minimax Kullback–Leibler risk in terms of the empirical Hellinger entropy for the conditional density class. The use of empirical entropy allows us to appeal to concentration arguments based on local Rademacher complexity, which—in contrast to uniform entropy—leads to matching rates for large, potentially nonparametric classes and captures the correct dependence on the complexity of the covariate space. Our results require only that the conditional densities are bounded above, and do not require that they are bounded below or otherwise satisfy any tail conditions.
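
To fix ideas (again in generic notation that may differ in detail from the paper's conventions), the quantity being bounded is the minimax Kullback–Leibler risk over a well-specified class $\mathcal{F}$ of conditional densities,
\[
\mathcal{R}_n(\mathcal{F}) \;=\; \inf_{\widehat{f}} \, \sup_{f^\star \in \mathcal{F},\; P_X} \mathbb{E}\Big[\mathrm{KL}\big(f^\star(\cdot \mid X)\,\big\|\,\widehat{f}(\cdot \mid X)\big)\Big],
\]
where the estimator $\widehat{f}$ is built from $n$ i.i.d. samples and $X \sim P_X$ is a fresh covariate. The empirical Hellinger entropy is the log covering number of $\mathcal{F}$ under the empirical squared Hellinger distance evaluated at the observed covariates $x_1, \dots, x_n$,
\[
d_{\mathrm{H},n}^2(f, g) \;=\; \frac{1}{n} \sum_{i=1}^{n} \int \Big(\sqrt{f(y \mid x_i)} - \sqrt{g(y \mid x_i)}\Big)^2 \, \mathrm{d}\nu(y),
\]
which depends on the covariate space only through the observed sample and therefore remains finite even when that space is unbounded.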

Funding Statement

BB acknowledges support from an NSERC Canada Graduate Scholarship and the Vector Institute. DMR is supported in part by an NSERC Discovery Grant and an Ontario Early Researcher Award. This material is also based upon work supported by the United States Air Force under Contract No. FA850-19-C-0511.

Acknowledgments

The authors thank Jeffrey Negrea, Yanbo Tang, and Yuhong Yang for helpful discussions and comments, and Abhishek Shetty for pointing out that Theorem 5 should use bracketing entropy rather than metric entropy.

Any opinions, findings and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the United States Air Force.

Citation


Blair Bilodeau, Dylan J. Foster, Daniel M. Roy. "Minimax rates for conditional density estimation via empirical entropy." Ann. Statist. 51(2): 762-790, April 2023. https://doi.org/10.1214/23-AOS2270

Information

Received: 1 September 2022; Published: April 2023
First available in Project Euclid: 13 June 2023

zbMATH: 07714180
MathSciNet: MR4601001
Digital Object Identifier: 10.1214/23-AOS2270

Subjects:
Primary: 62G05, 62G07
Secondary: 62C20, 94A17

Keywords: conditional density estimation, empirical entropy, logarithmic loss, minimax rates, nonparametric estimation

Rights: Copyright © 2023 Institute of Mathematical Statistics

Journal article, 29 pages

