## The Annals of Statistics

### Estimation up to a Change-Point

#### Abstract

Consider the problem of estimating $\mu$ based on the observation of $Y_0, Y_1, \ldots, Y_n$, where it is assumed only that $Y_0, Y_1, \ldots, Y_\kappa$ are iid $N(\mu, \sigma^2)$ for some unknown $\kappa$. Unlike the traditional change-point problem, the focus here is not on estimating $\kappa$, which is now a nuisance parameter. When it is known that $\kappa = k$, the sample mean $\bar{Y}_k = \sum_{i=0}^{k} Y_i/(k + 1)$ provides, in addition to wonderful efficiency properties, safety in the sense that it is minimax under squared error loss. Unfortunately, this safety breaks down when $\kappa$ is unknown; indeed, if $k > \kappa$, the risk of $\bar{Y}_k$ is unbounded. To address this problem, a generalized minimax criterion is considered whereby each estimator is evaluated by its maximum risk under $Y_0, Y_1, \ldots, Y_\kappa$ iid $N(\mu, \sigma^2)$ for each possible value of $\kappa$. An essentially complete class under this criterion is obtained. Generalizations to other situations, such as variance estimation, are illustrated.
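The abstract's claim that the risk of $\bar{Y}_k$ is unbounded when $k > \kappa$ can be illustrated with a small Monte Carlo sketch. This is not the paper's construction; it assumes a hypothetical contamination model in which observations after the change-point are shifted by an arbitrary amount, so that averaging past $\kappa$ introduces a bias that grows with the shift:

```python
import numpy as np

rng = np.random.default_rng(0)

def risk_of_mean(k, kappa, mu=0.0, sigma=1.0, shift=10.0, reps=4000):
    """Monte Carlo squared-error risk of the sample mean Y-bar_k when only
    Y_0, ..., Y_kappa are iid N(mu, sigma^2) and later observations are
    shifted by `shift` (an illustrative contamination model, not the
    paper's setup)."""
    n = max(k, kappa) + 1
    errs = np.empty(reps)
    for r in range(reps):
        y = rng.normal(mu, sigma, size=n)
        y[kappa + 1:] += shift                 # post-change observations drift away
        errs[r] = (y[:k + 1].mean() - mu) ** 2 # squared error of Y-bar_k
    return errs.mean()

kappa = 4
safe = risk_of_mean(k=kappa, kappa=kappa)  # averages only pre-change data
unsafe = risk_of_mean(k=9, kappa=kappa)    # averages in 5 post-change points
# Theory: risk is sigma^2/(k+1) = 0.2 when k = kappa; once k > kappa the
# squared bias scales with shift^2, so the maximum risk is unbounded.
```

Increasing `shift` makes `unsafe` arbitrarily large while `safe` stays at $\sigma^2/(k+1)$, which is the sense in which the minimax safety of $\bar{Y}_k$ breaks down for unknown $\kappa$.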

#### Article information

**Source:** Ann. Statist., Volume 21, Number 2 (1993), 625-644.

**Dates:** First available in Project Euclid: 12 April 2007

https://projecteuclid.org/euclid.aos/1176349141

**Digital Object Identifier:** doi:10.1214/aos/1176349141

**Mathematical Reviews number (MathSciNet):** MR1232509

**Zentralblatt MATH identifier:** 0779.62018