The Annals of Statistics

Minimax Multiple Shrinkage Estimation

Edward I. George

Abstract

For the canonical problem of estimating a multivariate normal mean under squared error loss, this article addresses the choice of a minimax shrinkage estimator when vague or conflicting prior information suggests that more than one estimator from a broad class might be effective. For this situation, a new class of alternative estimators, called multiple shrinkage estimators, is proposed. These estimators use the data to emulate the behavior and risk properties of the most effective estimator under consideration. Unbiased estimates of risk and sufficient conditions for minimaxity are provided. A Bayesian motivation links this construction to posterior means under mixture priors. To illustrate the theory, minimax multiple shrinkage Stein estimators are constructed that can adaptively shrink the data toward any number of points or subspaces.
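
To make the construction concrete, the following Python sketch illustrates the kind of estimator the abstract describes: a data-weighted combination of James-Stein estimators, each shrinking toward a different candidate target point, with weights proportional to ||x - nu_k||^-(p-2), the form suggested by harmonic priors centered at each target. The function names, the positive-part truncation, and the default equal prior weights are illustrative assumptions for this sketch, not details taken from the article itself.

import numpy as np

def james_stein(x, target):
    # Positive-part James-Stein estimator shrinking x toward a fixed target point.
    # (The positive-part truncation is a common practical modification, assumed here.)
    p = x.size
    d = x - target
    shrink = max(0.0, 1.0 - (p - 2) / np.dot(d, d))
    return target + shrink * d

def multiple_shrinkage(x, targets, prior_weights=None):
    # Convex combination of James-Stein estimators, one per candidate target.
    # Weights are proportional to prior_weight * ||x - target||^-(p-2), so the
    # component whose target best matches the data receives most of the weight.
    p = x.size
    targets = [np.asarray(t, dtype=float) for t in targets]
    if prior_weights is None:
        prior_weights = np.ones(len(targets))
    sq_dists = np.array([np.dot(x - t, x - t) for t in targets])
    w = np.asarray(prior_weights) * sq_dists ** (-(p - 2) / 2.0)
    w = w / w.sum()
    components = np.array([james_stein(x, t) for t in targets])
    return w @ components

# Example: a 10-dimensional observation whose true mean sits near the second target.
rng = np.random.default_rng(0)
theta = np.full(10, 3.0)
x = theta + rng.standard_normal(10)            # X ~ N(theta, I)
targets = [np.zeros(10), np.full(10, 3.0)]
print(multiple_shrinkage(x, targets))

In this sketch, the adaptive weights let the combined estimator track whichever target lies closest to the data, which is the behavior the abstract attributes to multiple shrinkage estimators; the exact form of the estimators and the conditions guaranteeing minimaxity are developed in the article.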

Article information

Source
Ann. Statist., Volume 14, Number 1 (1986), 188-205.

Dates
First available in Project Euclid: 12 April 2007

Permanent link to this document
https://projecteuclid.org/euclid.aos/1176349849

Digital Object Identifier
doi:10.1214/aos/1176349849

Mathematical Reviews number (MathSciNet)
MR829562

Zentralblatt MATH identifier
0602.62041


Subjects
Primary: 62F10: Point estimation
Secondary: 62F15: Bayesian inference

Keywords
Bayes estimator; mixture; multivariate normal mean; Stein estimator; superharmonic function; unbiased estimate of risk

Citation

George, Edward I. Minimax Multiple Shrinkage Estimation. Ann. Statist. 14 (1986), no. 1, 188--205. doi:10.1214/aos/1176349849. https://projecteuclid.org/euclid.aos/1176349849

