Electronic Journal of Statistics

Sharp minimax adaptation over Sobolev ellipsoids in nonparametric testing

Pengsheng Ji and Michael Nussbaum

Full-text: Open access

Abstract

In the problem of testing for signal in Gaussian white noise, over a smoothness class with an $L_{2}$-ball removed, minimax rates of convergence (separation rates) are well known (Ingster [24]); they are expressed as the rate at which the ball radius may tend to zero along with the noise intensity while nontrivial asymptotic power remains possible. It is also known that, if the smoothness class is a Sobolev-type ellipsoid of degree $\beta$ and size $M$, the optimal rate result can be sharpened towards a Pinsker-type asymptotics for the critical radius (Ermakov [9]). The minimax-optimal tests in that setting depend on $\beta$ and $M$; but whereas in nonparametric estimation with squared $L_{2}$-loss, adaptive estimators attaining the Pinsker constant are known, the analogous problem in testing is open. First, for adaptation to $M$ only, we establish that it is not possible at the critical separation rate, but is possible in the sense of the asymptotics of tail error probabilities at slightly slower rates. For full adaptation to $(\beta,M)$, it is well known that a $\log\log n$-penalty over the separation rate is incurred. We extend a preliminary result of Ingster and Suslina [25] relating to fixed $M$ and unknown $\beta$, and establish that sharp minimax adaptation to both parameters is possible. Thus a complete solution is obtained, in the basic $L_{2}$-case, to the problem of adaptive nonparametric testing at the level of asymptotic minimax constants.
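As a back-of-the-envelope illustration of the separation rate discussed above (this is not code from the paper; the parameter values, the truncation level, and the signal placed on the first coordinate are all illustrative assumptions), a plain chi-square test in the Gaussian sequence model $y_j = \theta_j + \varepsilon\,\xi_j$ already attains nontrivial power against signals a constant multiple of Ingster's rate $\varepsilon^{4\beta/(4\beta+1)}$ away from the null:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (not from the paper): Sobolev degree beta, ellipsoid size M.
beta, M = 2.0, 1.0
eps = 0.01                                    # noise intensity
rho = eps ** (4 * beta / (4 * beta + 1))      # Ingster's minimax separation rate

# Rate-optimal truncation level for the chi-square test: ~ eps^(-4/(4*beta+1)).
n = int(round(eps ** (-4 / (4 * beta + 1))))  # = 8 for these values

R = 20000                                     # Monte Carlo replications

# Null: y_j = eps * xi_j; chi-square statistic T = sum_j y_j^2.
noise0 = eps * rng.standard_normal((R, n))
T0 = (noise0 ** 2).sum(axis=1)

# Alternative: a signal a few multiples of rho away from 0, placed on the
# first coordinate, so it lies well inside the ellipsoid (theta_1^2 <= M).
theta = np.zeros(n)
theta[0] = 4 * rho
noise1 = eps * rng.standard_normal((R, n))
T1 = ((theta + noise1) ** 2).sum(axis=1)

thresh = np.quantile(T0, 0.95)                # empirical 5%-level critical value
level = (T0 > thresh).mean()
power = (T1 > thresh).mean()
print(f"separation rate rho = {rho:.4f}, level = {level:.3f}, power = {power:.3f}")
```

The simulation only illustrates the rate; the paper's subject is the finer question of the sharp (Pinsker-type) constant in front of this rate and of tests adaptive in $(\beta, M)$.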

Article information

Source
Electron. J. Statist., Volume 11, Number 2 (2017), 4515-4562.

Dates
Received: May 2016
First available in Project Euclid: 17 November 2017

Permanent link to this document
https://projecteuclid.org/euclid.ejs/1510887945

Digital Object Identifier
doi:10.1214/17-EJS1353

Mathematical Reviews number (MathSciNet)
MR3724488

Zentralblatt MATH identifier
1384.62148

Subjects
Primary: 62G10: Hypothesis testing; 62G20: Asymptotic properties

Keywords
Minimax hypothesis testing; nonparametric signal detection; sharp asymptotic adaptivity; moderate deviation

Rights
Creative Commons Attribution 4.0 International License.

Citation

Ji, Pengsheng; Nussbaum, Michael. Sharp minimax adaptation over Sobolev ellipsoids in nonparametric testing. Electron. J. Statist. 11 (2017), no. 2, 4515--4562. doi:10.1214/17-EJS1353. https://projecteuclid.org/euclid.ejs/1510887945

References

  • [1] Butucea, C., Matias, C. and Pouet, C. (2009). Adaptive goodness-of-fit testing from indirect observations. Ann. Inst. Henri Poincaré Probab. Stat. 45 352–372.
  • [2] Chen, L. H. Y., Goldstein, L. and Shao, Q.-M. (2011). Normal Approximation by Stein’s Method. Springer, New York.
  • [3] Collier, O. (2012). Minimax hypothesis testing for curve registration. Electron. J. Stat. 6 1129–1154.
  • [4] DasGupta, A. (2008). Asymptotic Theory of Statistics and Probability. Springer Texts in Statistics. Springer, New York.
  • [5] Dicker, L. H. (2016). Ridge regression and asymptotic minimax estimation over spheres of growing dimension. Bernoulli 22 1–37.
  • [6] Dümbgen, L. and Spokoiny, V. G. (2001). Multiscale testing of qualitative hypotheses. Ann. Statist. 29 124–152.
  • [7] Efroimovich, S. Y. and Pinsker, M. S. (1984). A self-training algorithm for nonparametric filtering. Avtomat. i Telemekh. 11 58–65.
  • [8] Ermakov, M. (2011). Nonparametric signal detection with small type I and type II error probabilities. Stat. Inference Stoch. Process. 14 1–19.
  • [9] Ermakov, M. S. (1990). Minimax detection of a signal in Gaussian white noise. Teor. Veroyatnost. i Primenen. 35 704–715.
  • [10] Ermakov, M. S. (1997). Asymptotic minimaxity of chi-squared tests. Teor. Veroyatnost. i Primenen. 42 668–695.
  • [11] Ermakov, M. S. (2003). Asymptotically efficient statistical inferences for moderate deviation probabilities. Teor. Veroyatnost. i Primenen. 48 676–700.
  • [12] Ermakov, M. S. (2008). Nonparametric hypothesis testing for small type I and type II error probabilities. Problemy Peredachi Informatsii 44 54–74.
  • [13] Fromont, M. and Laurent, B. (2006). Adaptive goodness-of-fit tests in a density model. Ann. Statist. 34 680–720.
  • [14] Fromont, M., Laurent, B. and Reynaud-Bouret, P. (2011). Adaptive tests of homogeneity for a Poisson process. Ann. Inst. Henri Poincaré Probab. Stat. 47 176–213.
  • [15] Gayraud, G. and Pouet, C. (2005). Adaptive minimax testing in the discrete regression scheme. Probab. Theory Related Fields 133 531–558.
  • [16] Golubev, G. K. (1987). Adaptive asymptotically minimax estimates for smooth signals. Problemy Peredachi Informatsii 23 57–67.
  • [17] Golubev, G. K. (1990). Quasilinear estimates for signals in $L_2$. Problemy Peredachi Informatsii 26 19–24.
  • [18] Golubev, G. K. and Nussbaum, M. (1990). A risk bound in Sobolev class regression. Ann. Statist. 18 758–778.
  • [19] Golubev, Y., Lepski, O. and Levit, B. (2001). On adaptive estimation for the sup-norm losses. Math. Methods Statist. 10 23–37.
  • [20] Götze, F. (1991). On the rate of convergence in the multivariate CLT. Ann. Probab. 19 724–739.
  • [21] Horowitz, J. L. and Spokoiny, V. G. (2001). An adaptive, rate-optimal test of a parametric mean-regression model against a nonparametric alternative. Econometrica 69 599–631.
  • [22] Inglot, T., Kallenberg, W. C. M. and Ledwina, T. (1992). Strong moderate deviation theorems. Ann. Probab. 20 987–1003.
  • [23] Ingster, Y. and Stepanova, N. (2011). Estimation and detection of functions from anisotropic Sobolev classes. Electron. J. Stat. 5 484–506.
  • [24] Ingster, Y. I. (1982). Minimax nonparametric detection of signals in white Gaussian noise. Problemy Peredachi Informatsii 18 61–73.
  • [25] Ingster, Y. I. and Suslina, I. A. (2003). Nonparametric Goodness-of-Fit Testing under Gaussian Models. Lecture Notes in Statistics 169. Springer-Verlag, New York.
  • [26] Ingster, Y. I. and Suslina, I. A. (2005). Nonparametric hypothesis testing for small type I errors. II. Math. Methods Statist. 14 28–52.
  • [27] Korostelëv, A. P. (1993). An asymptotically minimax regression estimator in the uniform norm up to a constant. Teor. Veroyatnost. i Primenen. 38 875–882.
  • [28] Lehmann, E. L. and Romano, J. P. (2005). Testing Statistical Hypotheses, Third ed. Springer, New York.
  • [29] Lepski, O. V. and Tsybakov, A. B. (2000). Asymptotically exact nonparametric hypothesis testing in sup-norm and at a fixed point. Probab. Theory Related Fields 117 17–48.
  • [30] Liese, F. and Miescke, K.-J. (2008). Statistical Decision Theory. Springer, New York.
  • [31] Nussbaum, M. (1999). Minimax risk: Pinsker bound. In Encyclopedia of Statistical Sciences, Update Volume 3 451–460. Wiley, New York.
  • [32] Petrov, V. V. (1995). Limit Theorems of Probability Theory. The Clarendon Press, Oxford University Press, New York.
  • [33] Pinsker, M. S. (1980). Optimal filtration of square-integrable signals in Gaussian noise. Problemy Peredachi Informatsii 16 52–68.
  • [34] Rohde, A. (2008). Adaptive goodness-of-fit tests based on signed ranks. Ann. Statist. 36 1346–1374.
  • [35] Spokoiny, V. G. (1996). Adaptive hypothesis testing using wavelets. Ann. Statist. 24 2477–2498.
  • [36] Tsybakov, A. B. (2009). Introduction to Nonparametric Estimation. Springer, New York.
  • [37] van der Vaart, A. W. (1998). Asymptotic Statistics. Cambridge University Press, Cambridge.