Open Access
A new approach to estimator selection
O.V. Lepski
Bernoulli 24(4A): 2776-2810 (November 2018). DOI: 10.3150/17-BEJ945


In the framework of an abstract statistical model, we discuss how to use the solution of one estimation problem (Problem A) in order to construct an estimator in another, completely different, Problem B. As a solution of Problem A we understand a data-driven selection from a given family of estimators $\mathbf{A}(\mathfrak{H})=\{\widehat{A}_{\mathfrak{h}},\mathfrak{h}\in\mathfrak{H}\}$ and the establishment of a so-called oracle inequality for the selected estimator. If $\hat{\mathfrak{h}}\in\mathfrak{H}$ is the selected parameter and $\mathbf{B}(\mathfrak{H})=\{\widehat{B}_{\mathfrak{h}},\mathfrak{h}\in\mathfrak{H}\}$ is a collection of estimators built in Problem B, we suggest using the estimator $\widehat{B}_{\hat{\mathfrak{h}}}$. We present a very general selection rule leading to the selector $\hat{\mathfrak{h}}$ and find conditions under which the estimator $\widehat{B}_{\hat{\mathfrak{h}}}$ is reasonable. Our approach is illustrated by several examples related to adaptive estimation.
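The select-then-transfer scheme described above can be sketched numerically. In the toy sketch below, Problem A is kernel density estimation over a finite bandwidth grid $\mathfrak{H}$, the selector $\hat{\mathfrak{h}}$ is obtained by least-squares cross-validation (a common data-driven rule used here purely as a stand-in; the paper's selection rule is far more general), and Problem B is estimation of the density derivative, where the selected bandwidth is simply reused. All function names, the bandwidth grid, and the choice of selection criterion are illustrative assumptions, not the paper's construction.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=200)          # observed sample
H = np.array([0.1, 0.2, 0.4, 0.8])  # candidate parameters (bandwidth grid, illustrative)

def kde(x, data, h):
    # Gaussian kernel density estimator A_h (Problem A family)
    u = (x[:, None] - data[None, :]) / h
    return np.exp(-0.5 * u**2).mean(axis=1) / (h * np.sqrt(2 * np.pi))

def lscv_score(data, h):
    # Least-squares cross-validation criterion: unbiased (up to a constant)
    # estimate of the integrated squared error of the KDE with bandwidth h.
    n = len(data)
    u = (data[:, None] - data[None, :]) / h
    K = np.exp(-0.5 * u**2) / np.sqrt(2 * np.pi)
    # Integral of fhat^2: Gaussian kernel self-convolution is N(0, 2),
    # so (K*K)(u) = exp(-u^2/4) / sqrt(4*pi).
    K2 = np.exp(-0.25 * u**2) / np.sqrt(4 * np.pi)
    int_f2 = K2.sum() / (n**2 * h)
    # Leave-one-out term: off-diagonal kernel sum (diagonal entries K(0) removed).
    loo = (K.sum() - n * K[0, 0]) / (n * (n - 1) * h)
    return int_f2 - 2 * loo

# Problem A: data-driven selection of h_hat from the family {A_h, h in H}
scores = np.array([lscv_score(X, h) for h in H])
h_hat = H[np.argmin(scores)]

def kde_deriv(x, data, h):
    # Problem B family B_h: kernel estimator of the density derivative
    u = (x[:, None] - data[None, :]) / h
    return (-u * np.exp(-0.5 * u**2)).mean(axis=1) / (h**2 * np.sqrt(2 * np.pi))

# Transfer: plug the selector from Problem A into the Problem B family
xs = np.linspace(-3, 3, 61)
f_hat = kde(xs, X, h_hat)          # selected estimator in Problem A
fprime_hat = kde_deriv(xs, X, h_hat)  # estimator B_{h_hat} in Problem B
```

The point of the sketch is only the last step: the data touch the selection rule once, in Problem A, and the resulting $\hat{\mathfrak{h}}$ indexes the estimator used in Problem B. Whether $\widehat{B}_{\hat{\mathfrak{h}}}$ is a reasonable estimator is exactly the question the paper's oracle-inequality conditions address.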


Citation

O.V. Lepski. "A new approach to estimator selection." Bernoulli 24 (4A) 2776 - 2810, November 2018.


Received: 1 October 2016; Revised: 1 March 2017; Published: November 2018
First available in Project Euclid: 26 March 2018

zbMATH: 06853265
MathSciNet: MR3779702
Digital Object Identifier: 10.3150/17-BEJ945

Keywords: adaptive estimation, density model, generalized deconvolution model, oracle approach, upper function

Rights: Copyright © 2018 Bernoulli Society for Mathematical Statistics and Probability
