A strong converse bound for multiple hypothesis testing, with applications to high-dimensional estimation
Ramji Venkataramanan, Oliver Johnson
Electron. J. Statist. 12(1): 1126-1149 (2018). DOI: 10.1214/18-EJS1419

Abstract

In statistical inference problems, we wish to obtain lower bounds on the minimax risk, that is, to bound the performance of any possible estimator. A standard technique for this involves Fano’s inequality. However, recent work in an information-theoretic setting has shown that an argument based on binary hypothesis testing gives tighter converse results (error lower bounds) than Fano’s inequality for channel coding problems. We adapt this technique to the statistical setting, and argue that Fano’s inequality can always be replaced by this approach to obtain tighter lower bounds that are easily computed and asymptotically sharp. We illustrate our technique in three applications: density estimation, active learning of a binary classifier, and compressed sensing, obtaining tighter risk lower bounds in each case.
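For context, the Fano-based bound that the abstract proposes to replace is the standard multiple-hypothesis form of Fano’s inequality (a textbook statement, not reproduced from the paper itself): if a parameter $V$ is uniform over $M$ well-separated hypotheses and $X$ denotes the observed data, then any estimator $\hat{V}(X)$ satisfies

$$
\mathbb{P}\bigl(\hat{V}(X) \neq V\bigr) \;\ge\; 1 - \frac{I(V;X) + \log 2}{\log M}.
$$

A minimax risk lower bound then follows by relating estimation error under a loss to the probability of misidentifying the hypothesis. The paper’s strong converse replaces the right-hand side with a bound derived from binary hypothesis testing, which can be strictly tighter.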

Citation


Ramji Venkataramanan, Oliver Johnson. "A strong converse bound for multiple hypothesis testing, with applications to high-dimensional estimation." Electron. J. Statist. 12(1): 1126–1149, 2018. https://doi.org/10.1214/18-EJS1419

Information

Received: 1 June 2017; Published: 2018
First available in Project Euclid: 27 March 2018

zbMATH: 06864487
MathSciNet: MR3780042
Digital Object Identifier: 10.1214/18-EJS1419

Subjects:
Primary: 62B10, 62G05, 62G07

JOURNAL ARTICLE
24 PAGES

