### An Ancillarity Paradox Which Appears in Multiple Linear Regression

Lawrence D. Brown
Source: Ann. Statist. Volume 18, Number 2 (1990), 471-493.

#### Abstract

Consider a multiple linear regression in which $Y_i, i = 1, \cdots, n$, are independent normal variables with variance $\sigma^2$ and $E(Y_i) = \alpha + V'_i\beta$, where $V_i \in \mathbb{R}^r$ and $\beta \in \mathbb{R}^r$. Let $\hat{\alpha}$ denote the usual least squares estimator of $\alpha$. Suppose that the $V_i$ are themselves observations of independent multivariate normal random variables with mean 0 and known, nonsingular covariance matrix $\theta$. Then $\hat{\alpha}$ is inadmissible under squared error loss if $r \geq 2$. Several estimators dominating $\hat{\alpha}$ when $r \geq 3$ are presented. Analogous results are presented for the case where $\sigma^2$ or $\theta$ is unknown, and some other generalizations are also considered. It is noted that some of these results for $r \geq 3$ appear in earlier papers of Baranchik and of Takada. The $\{V_i\}$ are ancillary statistics in the above setting. Hence admissibility of $\hat{\alpha}$ depends on the distribution of the ancillary statistics, since if $\{V_i\}$ is fixed instead of random, then $\hat{\alpha}$ is admissible. This fact contradicts a widely held notion about ancillary statistics; some interpretations and consequences of this paradox are briefly discussed.
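The model in the abstract can be simulated directly. The sketch below (illustrative only; the parameter values and the ad hoc shrinkage factor are assumptions, not taken from the paper) draws random covariates $V_i \sim N(0, I_r)$, fits the least squares intercept $\hat{\alpha}$, and checks the standard identity $\hat{\alpha} = \bar{Y} - \bar{V}'\hat{\beta}$. The shrunken intercept at the end merely illustrates the *form* of a Stein-type competitor that shrinks the correction term $\bar{V}'\hat{\beta}$; it is not claimed to be one of Brown's dominating estimators.

```python
import numpy as np

rng = np.random.default_rng(0)
n, r = 50, 3                      # r >= 3: the regime with dominating estimators
alpha, sigma = 1.0, 1.0           # illustrative true parameter values (assumed)
beta = np.array([0.5, -0.3, 0.2])

# Covariates are themselves random: V_i ~ N(0, theta) with theta = I_r here.
V = rng.standard_normal((n, r))
Y = alpha + V @ beta + sigma * rng.standard_normal(n)

# Usual least squares fit of E(Y_i) = alpha + V_i' beta.
X = np.column_stack([np.ones(n), V])
coef, *_ = np.linalg.lstsq(X, Y, rcond=None)
alpha_hat, beta_hat = coef[0], coef[1:]

# Standard identity for the OLS intercept: alpha_hat = Ybar - Vbar' beta_hat.
Vbar = V.mean(axis=0)
assert abs(alpha_hat - (Y.mean() - Vbar @ beta_hat)) < 1e-8

# Purely illustrative Stein-type form (ad hoc shrinkage constant, an
# assumption for exposition -- not Brown's estimator): shrink the
# correction term Vbar' beta_hat toward zero before subtracting it.
correction = Vbar @ beta_hat
shrink = max(0.0, 1.0 - sigma**2 / (n * correction**2 + sigma**2))
alpha_shrunk = Y.mean() - shrink * correction

print(alpha_hat, alpha_shrunk)
```

Under the unconditional (random-$V$) risk, estimators of this shrinkage form can beat $\hat{\alpha}$ for $r \geq 3$, whereas conditionally on $\{V_i\}$ the least squares intercept is admissible; that gap is exactly the paradox the abstract describes.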

Primary Subjects: 62C15
Secondary Subjects: 62C20, 62F10, 62A99, 62H12, 62J05
Full-text: Open access

Permanent link to this document: http://projecteuclid.org/euclid.aos/1176347602
Digital Object Identifier: doi:10.1214/aos/1176347602
Mathematical Reviews number (MathSciNet): MR1056325
Zentralblatt MATH identifier: 0721.62011