Annals of Statistics, Volume 30, Number 6 (2002), 1782–1810.
A unified jackknife theory for empirical best prediction with M-estimation
Jiming Jiang, P. Lahiri, and Shu-Mei Wan
Abstract
The paper presents a unified jackknife theory for a fairly general class of mixed models, which includes some of the widely used mixed linear models and generalized linear mixed models as special cases. The paper develops jackknife theory for the important, but so far neglected, prediction problem for the general mixed model. For estimation of fixed parameters, a jackknife method is considered for a general class of M-estimators, which includes the maximum likelihood, residual maximum likelihood and ANOVA estimators for mixed linear models, as well as the recently developed method of simulated moments estimators for generalized linear mixed models. For both the prediction and estimation problems, a jackknife method is used to obtain estimators of the mean squared error (MSE). Asymptotic unbiasedness of the MSE estimators is shown to hold essentially under certain moment conditions. Simulation studies support the theoretical results.
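To illustrate the general idea behind a delete-one-group jackknife MSE estimator, here is a minimal sketch in Python. This is only a generic illustration of the jackknife bias correction and variance estimate for an estimator computed from independent groups (e.g., small areas); it is not the paper's exact empirical-best-prediction formulas, and the function names are hypothetical.

```python
import numpy as np

def jackknife_mse(groups, estimator):
    """Delete-one-group jackknife for an estimator computed from a list
    of independent groups.

    groups    : list of arrays, one array of observations per group
    estimator : function mapping a list of groups to a scalar estimate

    Returns (bias-corrected estimate, jackknife variance estimate).
    A generic sketch only -- the paper's EBP-specific MSE estimator
    involves additional model-based terms.
    """
    m = len(groups)
    theta_full = estimator(groups)
    # Recompute the estimator with each group deleted in turn.
    theta_del = np.array([
        estimator(groups[:i] + groups[i + 1:]) for i in range(m)
    ])
    # Jackknife bias correction: theta - (m-1)/m * sum_i (theta_{-i} - theta)
    bias_corrected = theta_full - (m - 1) / m * np.sum(theta_del - theta_full)
    # Jackknife variance estimate: (m-1)/m * sum_i (theta_{-i} - mean)^2
    var_hat = (m - 1) / m * np.sum((theta_del - theta_del.mean()) ** 2)
    return bias_corrected, var_hat
```

For the sample mean with one observation per group, this recovers the classical jackknife: the bias correction vanishes and the variance estimate equals the usual `s²/n`.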
Article information
Dates
First available in Project Euclid: 23 January 2003
Permanent link to this document
https://projecteuclid.org/euclid.aos/1043351257
Digital Object Identifier
doi:10.1214/aos/1043351257
Mathematical Reviews number (MathSciNet)
MR1969450
Zentralblatt MATH identifier
1020.62025
Subjects
Primary: 62G09 (Resampling methods); 62D05 (Sampling theory, sample surveys)
Keywords
Empirical best predictors; mean squared errors; $M$-estimators; mixed linear models; mixed logistic models; small-area estimation; uniform consistency; variance components
Citation
Jiang, Jiming; Lahiri, P.; Wan, Shu-Mei. A unified jackknife theory for empirical best prediction with M-estimation. Ann. Statist. 30 (2002), no. 6, 1782–1810. doi:10.1214/aos/1043351257. https://projecteuclid.org/euclid.aos/1043351257

