2017 Sharp minimax adaptation over Sobolev ellipsoids in nonparametric testing
Pengsheng Ji, Michael Nussbaum
Electron. J. Statist. 11(2): 4515-4562 (2017). DOI: 10.1214/17-EJS1353


In the problem of testing for signal in Gaussian white noise, over a smoothness class with an $L_{2}$-ball removed, minimax rates of convergence (separation rates) are well known (Ingster [24]); they are expressed as the rate at which the ball radius may tend to zero, jointly with the noise intensity, such that nontrivial asymptotic power is still possible. It is also known that, if the smoothness class is a Sobolev type ellipsoid of degree $\beta$ and size $M$, the optimal rate result can be sharpened towards a Pinsker type asymptotics for the critical radius (Ermakov [9]). The minimax optimal tests in that setting depend on $\beta$ and $M$; but whereas in nonparametric estimation with squared $L_{2}$-loss, adaptive estimators attaining the Pinsker constant are known, the analogous problem in testing is open. First, for adaptation to $M$ only, we establish that it is not possible at the critical separation rate, but is possible in the sense of the asymptotics of tail error probabilities at slightly slower rates. For full adaptation to $(\beta,M)$, it is well known that a $\log\log n$-penalty over the separation rate is incurred. We extend a preliminary result of Ingster and Suslina [25] relating to fixed $M$ and unknown $\beta$, and establish that sharp minimax adaptation to both parameters is possible. Thus a complete solution is obtained, in the basic $L_{2}$-case, to the problem of adaptive nonparametric testing at the level of asymptotic minimax constants.
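As a hedged illustration of the testing problem described above (not the paper's own construction), the signal detection setup can be simulated in the equivalent Gaussian sequence model $y_k=\theta_k+\varepsilon\xi_k$: under the null the signal is zero, while under the alternative $\theta$ lies in the Sobolev ellipsoid $\sum_k k^{2\beta}\theta_k^2\le M$ with $\|\theta\|_2\ge\rho$. The sketch below uses a simple, non-optimal chi-square type statistic $T=\sum_{k\le K}(y_k^2-\varepsilon^2)$ with a bandwidth $K$ matched to Ingster's separation rate $\rho^*\asymp\varepsilon^{4\beta/(4\beta+1)}$; all numerical choices (seed, $\beta$, $M$, $\varepsilon$, the specific alternative) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(theta, eps):
    # Gaussian sequence model: y_k = theta_k + eps * xi_k
    return theta + eps * rng.standard_normal(theta.size)

def chi2_stat(y, eps, K):
    # centered chi-square type statistic over the first K coordinates
    return np.sum(y[:K] ** 2 - eps**2)

# illustrative (hypothetical) parameter choices
beta, M, eps, n = 1.0, 1.0, 0.05, 2000
K = max(1, int(eps ** (-2.0 / (4 * beta + 1))))  # bandwidth matched to the rate
rho = eps ** (4 * beta / (4 * beta + 1))         # separation rate eps^{4b/(4b+1)}

# one alternative: energy rho^2 spread over the first K low frequencies
theta = np.zeros(n)
theta[:K] = rho / np.sqrt(K)
# check the alternative lies inside the Sobolev ellipsoid of size M
assert np.sum(np.arange(1, K + 1) ** (2 * beta) * theta[:K] ** 2) <= M

# average statistic under the null and under the alternative
t0 = np.mean([chi2_stat(simulate(np.zeros(n), eps), eps, K) for _ in range(200)])
t1 = np.mean([chi2_stat(simulate(theta, eps), eps, K) for _ in range(200)])
```

The statistic has mean $0$ under the null and mean $\|\theta\|_2^2=\rho^2$ under this alternative, so `t1` separates from `t0` once $\rho$ exceeds the critical radius; the sharp (Ermakov-type) constants and the adaptive constructions studied in the paper refine exactly this kind of comparison.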


Download Citation

Pengsheng Ji, Michael Nussbaum. "Sharp minimax adaptation over Sobolev ellipsoids in nonparametric testing." Electron. J. Statist. 11 (2): 4515-4562, 2017.


Received: 1 May 2016; Published: 2017
First available in Project Euclid: 17 November 2017

zbMATH: 1384.62148
MathSciNet: MR3724488
Digital Object Identifier: 10.1214/17-EJS1353

Primary: 62G10, 62G20

