Open Access
February 2012
Minimax and Adaptive Inference in Nonparametric Function Estimation
T. Tony Cai
Statist. Sci. 27(1): 31-50 (February 2012). DOI: 10.1214/11-STS355

Abstract

Since Stein’s 1956 seminal paper, shrinkage has played a fundamental role in both parametric and nonparametric inference. This article discusses minimaxity and adaptive minimaxity in nonparametric function estimation. Three interrelated problems are considered: function estimation under global integrated squared error, estimation under pointwise squared error, and nonparametric confidence intervals. Shrinkage is pivotal in the development of both the minimax theory and the adaptation theory.
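To fix ideas, the following is a minimal sketch (not taken from the article) of the kind of shrinkage the abstract refers to: positive-part James–Stein shrinkage of noisy coefficients in the Gaussian sequence model y_i = θ_i + σ z_i. The function name, signal, and noise level are illustrative choices; Python with NumPy is assumed.

```python
import numpy as np

def james_stein_shrink(y, sigma=1.0):
    """Positive-part James-Stein shrinkage toward the origin for
    observations y_i = theta_i + sigma * z_i, z_i ~ N(0, 1)."""
    n = len(y)
    norm_sq = np.sum(y ** 2)
    factor = max(0.0, 1.0 - (n - 2) * sigma ** 2 / norm_sq)
    return factor * y

# Illustrative use: shrinkage typically reduces total squared error
# relative to the raw observations when the signal is not too large.
rng = np.random.default_rng(0)
theta = np.linspace(0.0, 1.0, 50) ** 2            # a smooth "signal"
y = theta + 0.1 * rng.standard_normal(50)         # noisy observations
theta_hat = james_stein_shrink(y, sigma=0.1)
print(np.sum((theta_hat - theta) ** 2), np.sum((y - theta) ** 2))
```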

While the three problems are closely connected and the minimax theories bear some similarities, the adaptation theories are strikingly different. For example, in sharp contrast to adaptive point estimation, in many common settings there do not exist nonparametric confidence intervals that adapt to the unknown smoothness of the underlying function. A concise account of these theories is given. The connections as well as the differences among these problems are discussed and illustrated through examples.
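Block thresholding, one of the keyword techniques for adaptive estimation, can be sketched as follows. This is a rough illustration in the sequence model, not the article's construction: coefficients are grouped into blocks of length about log n, each block is shrunk as a unit, and blocks with small total energy are set to zero. The block length, the threshold constant lam, and the function name are assumed for illustration.

```python
import numpy as np

def block_threshold(y, sigma=1.0, block_len=None, lam=4.5):
    """Blockwise James-Stein-type thresholding of noisy coefficients
    y_i = theta_i + sigma * z_i: each block is shrunk as a unit and
    killed entirely if its energy is below lam * len(block) * sigma^2."""
    n = len(y)
    if block_len is None:
        block_len = max(1, int(np.log(n)))        # block size ~ log n
    theta_hat = np.zeros_like(y, dtype=float)
    for start in range(0, n, block_len):
        block = y[start:start + block_len]
        energy = np.sum(block ** 2)
        if energy > 0:
            factor = max(0.0, 1.0 - lam * len(block) * sigma ** 2 / energy)
        else:
            factor = 0.0
        theta_hat[start:start + block_len] = factor * block
    return theta_hat
```

Pooling information within a block is what lets such rules adapt to unknown smoothness in point estimation; the abstract's point is that no analogous construction yields honest adaptive confidence intervals in many common settings.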

Citation


T. Tony Cai. "Minimax and Adaptive Inference in Nonparametric Function Estimation." Statist. Sci. 27(1): 31-50, February 2012. https://doi.org/10.1214/11-STS355

Information

Published: February 2012
First available in Project Euclid: 14 March 2012

zbMATH: 1330.62059
MathSciNet: MR2953494
Digital Object Identifier: 10.1214/11-STS355

Keywords: adaptation, adaptive estimation, Bayes minimax, Besov ball, block thresholding, confidence interval, ellipsoid, information pooling, linear functional, linear minimaxity, minimax, nonparametric regression, oracle, separable rules, sequence model, shrinkage, thresholding, wavelet, white noise model

Rights: Copyright © 2012 Institute of Mathematical Statistics
