Open Access
A note on the approximate admissibility of regularized estimators in the Gaussian sequence model
Xi Chen, Adityanand Guntuboyina, Yuchen Zhang
Electron. J. Statist. 11(2): 4746-4768 (2017). DOI: 10.1214/17-EJS1354

Abstract

We study the problem of estimating an unknown vector $\theta$ from an observation $X$ drawn according to the normal distribution with mean $\theta$ and identity covariance matrix under the knowledge that $\theta$ belongs to a known closed convex set $\Theta$. In this general setting, Chatterjee (2014) proved that the natural constrained least squares estimator is “approximately admissible” for every $\Theta$. We extend this result by proving that the same property holds for all convex penalized estimators as well. Moreover, we simplify and shorten the original proof considerably. We also provide explicit upper and lower bounds for the universal constant underlying the notion of approximate admissibility.
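For concreteness, the setting and the two estimators mentioned in the abstract can be written as follows; this is a sketch in our own notation (the symbol $f$ for the convex penalty and the exact form of the penalized criterion are assumptions for illustration, not taken verbatim from the paper):

$$X \sim N(\theta, I_n), \qquad \theta \in \Theta \subseteq \mathbb{R}^n, \quad \Theta \ \text{closed and convex},$$
$$\hat{\theta}_{\mathrm{LS}} \in \operatorname*{argmin}_{t \in \Theta} \|X - t\|^2, \qquad \hat{\theta}_{f} \in \operatorname*{argmin}_{t \in \mathbb{R}^n} \Big\{ \|X - t\|^2 + f(t) \Big\}, \quad f \ \text{convex}.$$

Here $\hat{\theta}_{\mathrm{LS}}$ is the constrained least squares estimator (the Euclidean projection of $X$ onto $\Theta$), and $\hat{\theta}_{f}$ is a generic convex penalized estimator of the kind covered by the extension described above.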

Citation


Xi Chen, Adityanand Guntuboyina, Yuchen Zhang. "A note on the approximate admissibility of regularized estimators in the Gaussian sequence model." Electron. J. Statist. 11(2): 4746-4768, 2017. https://doi.org/10.1214/17-EJS1354

Information

Received: 1 March 2017; Published: 2017
First available in Project Euclid: 24 November 2017

zbMATH: 06816632
MathSciNet: MR3729658
Digital Object Identifier: 10.1214/17-EJS1354

Keywords: Admissibility, Bayes risk, Gaussian sequence model, least squares estimator, minimaxity
