Open Access
Estimation up to a Change-Point
Dean P. Foster, Edward I. George
Ann. Statist. 21(2): 625-644 (June, 1993). DOI: 10.1214/aos/1176349141

Abstract

Consider the problem of estimating $\mu$, based on the observation of $Y_0, Y_1, \ldots, Y_n$, where it is assumed only that $Y_0, Y_1, \ldots, Y_\kappa$ are iid $N(\mu, \sigma^2)$ for some unknown $\kappa$. Unlike the traditional change-point problem, the focus here is not on estimating $\kappa$, which is now a nuisance parameter. When it is known that $\kappa = k$, the sample mean $\bar{Y}_k = \sum_{i=0}^{k} Y_i/(k + 1)$ provides, in addition to wonderful efficiency properties, safety in the sense that it is minimax under squared error loss. Unfortunately, this safety breaks down when $\kappa$ is unknown; indeed, if $k > \kappa$, the risk of $\bar{Y}_k$ is unbounded. To address this problem, a generalized minimax criterion is considered whereby each estimator is evaluated by its maximum risk under $Y_0, Y_1, \ldots, Y_\kappa$ iid $N(\mu, \sigma^2)$ for each possible value of $\kappa$. An essentially complete class under this criterion is obtained. Generalizations to other situations, such as variance estimation, are illustrated.
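
A small simulation illustrates the abstract's warning that pooling observations beyond the change-point can make the risk of $\bar{Y}_k$ arbitrarily large. The Python sketch below is not taken from the paper; the post-change shift `delta` and the particular values of `kappa`, `n`, and `sigma` are illustrative assumptions chosen only to show the qualitative behaviour.

```python
import numpy as np

rng = np.random.default_rng(0)

mu, sigma = 0.0, 1.0   # pre-change mean and standard deviation (assumed values)
kappa = 9              # true change-point: Y_0, ..., Y_kappa are iid N(mu, sigma^2)
n = 19                 # Y_0, ..., Y_n are observed
delta = 5.0            # illustrative post-change mean shift; the model leaves
                       # post-change behaviour unrestricted
reps = 20_000

def ybar(y, k):
    """Sample mean of Y_0, ..., Y_k, i.e. sum_{i=0}^k Y_i / (k + 1)."""
    return y[: k + 1].mean()

# Monte Carlo estimate of the risk (MSE under squared error loss) of Ybar_k
# for each pooling choice k.
risk = np.zeros(n + 1)
for _ in range(reps):
    y = rng.normal(mu, sigma, n + 1)
    y[kappa + 1:] += delta          # observations after the change-point go off-model
    for k in range(n + 1):
        risk[k] += (ybar(y, k) - mu) ** 2
risk /= reps

for k in (0, kappa, kappa + 1, n):
    print(f"k = {k:2d}  estimated risk of Ybar_k = {risk[k]:.3f}")
# The risk drops as k grows toward kappa (more pre-change data pooled), then jumps
# once k > kappa; since delta is arbitrary, that excess risk can be made as large
# as desired, which is the sense in which the safety of Ybar_k breaks down.
```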

Citation


Dean P. Foster, Edward I. George. "Estimation up to a Change-Point." Ann. Statist. 21(2): 625-644, June, 1993. https://doi.org/10.1214/aos/1176349141

Information

Published: June, 1993
First available in Project Euclid: 12 April 2007

zbMATH: 0779.62018
MathSciNet: MR1232509
Digital Object Identifier: 10.1214/aos/1176349141

Subjects:
Primary: 62F10
Secondary: 62C20, 62L12

Keywords: Change-point problems, Equivariance, Hunt-Stein theorem, minimax procedures, pooling data, risk

Rights: Copyright © 1993 Institute of Mathematical Statistics

Vol. 21 • No. 2 • June, 1993