The Annals of Statistics

Estimation up to a Change-Point

Dean P. Foster and Edward I. George



Consider the problem of estimating $\mu$, based on the observation of $Y_0, Y_1, \ldots, Y_n$, where it is assumed only that $Y_0, Y_1, \ldots, Y_\kappa \stackrel{\text{iid}}{\sim} N(\mu, \sigma^2)$ for some unknown $\kappa$. Unlike the traditional change-point problem, the focus here is not on estimating $\kappa$, which is now a nuisance parameter. When it is known that $\kappa = k$, the sample mean $\bar{Y}_k = \sum^k_{i=0} Y_i/(k + 1)$ provides, in addition to wonderful efficiency properties, safety in the sense that it is minimax under squared error loss. Unfortunately, this safety breaks down when $\kappa$ is unknown; indeed, if $k > \kappa$, the risk of $\bar{Y}_k$ is unbounded. To address this problem, a generalized minimax criterion is considered whereby each estimator is evaluated by its maximum risk under $Y_0, Y_1, \ldots, Y_\kappa \stackrel{\text{iid}}{\sim} N(\mu, \sigma^2)$ for each possible value of $\kappa$. An essentially complete class under this criterion is obtained. Generalizations to other situations such as variance estimation are illustrated.
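The danger the abstract describes can be seen in a small Monte Carlo sketch. The model guarantees nothing about $Y_{\kappa+1}, \ldots, Y_n$; below, purely for illustration, the post-change observations are given a hypothetical mean shift `delta` (not part of the paper's setup, where they are arbitrary). Pooling past the change-point then inflates the squared-error risk of the sample mean far beyond that of $\bar{Y}_\kappa$, consistent with the claim that the risk of $\bar{Y}_k$ for $k > \kappa$ is unbounded (let `delta` grow and the pooled risk grows without limit).

```python
import numpy as np

rng = np.random.default_rng(0)
mu, sigma = 0.0, 1.0
kappa = 9          # change-point: Y_0, ..., Y_kappa are iid N(mu, sigma^2)
n = 19             # observations Y_0, ..., Y_n are available
delta = 10.0       # hypothetical post-change mean shift, for illustration only
reps = 2000

err_safe, err_pooled = [], []
for _ in range(reps):
    pre = rng.normal(mu, sigma, kappa + 1)             # covered by the model
    post = rng.normal(mu + delta, sigma, n - kappa)    # model gives no guarantee here
    y = np.concatenate([pre, post])
    err_safe.append((pre.mean() - mu) ** 2)    # \bar Y_kappa: stops at the change-point
    err_pooled.append((y.mean() - mu) ** 2)    # \bar Y_n with n > kappa: pools past it

mse_safe = np.mean(err_safe)      # close to sigma^2 / (kappa + 1) = 0.1
mse_pooled = np.mean(err_pooled)  # dominated by the squared bias from pooling
```

Here the pooled estimator's risk is driven by the bias term $((n-\kappa)\,\delta/(n+1))^2$, which can be made arbitrarily large, whereas $\bar{Y}_\kappa$ keeps the familiar risk $\sigma^2/(\kappa+1)$.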

Article information

Ann. Statist., Volume 21, Number 2 (1993), 625-644.

First available in Project Euclid: 12 April 2007



Primary: 62F10: Point estimation
Secondary: 62C20: Minimax procedures; 62L12: Sequential estimation

Keywords: Change-point problems; equivariance; Hunt-Stein theorem; minimax procedures; risk; pooling data


Foster, Dean P.; George, Edward I. Estimation up to a Change-Point. Ann. Statist. 21 (1993), no. 2, 625--644. doi:10.1214/aos/1176349141.
