Open Access
A unified jackknife theory for empirical best prediction with M-estimation
Jiming Jiang, P. Lahiri, Shu-Mei Wan
Ann. Statist. 30(6): 1782-1810 (December 2002). DOI: 10.1214/aos/1043351257

Abstract

The paper presents a unified jackknife theory for a fairly general class of mixed models that includes some of the widely used mixed linear models and generalized linear mixed models as special cases. The paper develops jackknife theory for the important, but so far neglected, prediction problem for the general mixed model. For estimation of fixed parameters, a jackknife method is considered for a general class of M-estimators that includes the maximum likelihood, residual maximum likelihood and ANOVA estimators for mixed linear models, as well as the recently developed method of simulated moments estimators for generalized linear mixed models. For both the prediction and estimation problems, a jackknife method is used to obtain estimators of the mean squared errors (MSE). Asymptotic unbiasedness of the MSE estimators is shown to hold essentially under certain moment conditions. Simulation studies support the theoretical results.
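
The jackknife MSE estimator studied here combines a bias-corrected leading term with a delete-one variability term; for area i it has, roughly, the form b_i(ψ̂) − ((m−1)/m) Σ_u [b_i(ψ̂_{−u}) − b_i(ψ̂)] + ((m−1)/m) Σ_u [θ̂_i(ψ̂_{−u}) − θ̂_i(ψ̂)]², where ψ̂_{−u} is the M-estimator computed with unit u deleted. The sketch below is a rough illustration only, applying the delete-one-area idea to a simple area-level model with known sampling variances and a truncated moment estimator of the variance component; the function names and the specific estimators used are illustrative assumptions, not the paper's general M-estimation framework.

```python
import numpy as np

def psi_hat(y, D):
    """Illustrative moment-based estimators (mu, A) for the model
    y_i = mu + v_i + e_i, v_i ~ (0, A), e_i ~ (0, D_i), D_i known."""
    A = max(np.var(y, ddof=1) - np.mean(D), 0.0)   # truncated moment estimator of A
    w = 1.0 / (A + D)                              # GLS-type weights
    mu = np.sum(w * y) / np.sum(w)
    return mu, A

def theta_hat(i, y, D, psi):
    """Empirical best predictor of theta_i = mu + v_i at parameter psi."""
    mu, A = psi
    gamma = A / (A + D[i])
    return gamma * y[i] + (1.0 - gamma) * mu

def b_leading(i, D, psi):
    """Leading MSE term of the best predictor: A * D_i / (A + D_i)."""
    _, A = psi
    return A * D[i] / (A + D[i])

def jackknife_mse(i, y, D):
    """Delete-one-area jackknife MSE estimate for area i
    (simplified sketch in the spirit of the paper's estimator)."""
    m = len(y)
    psi_full = psi_hat(y, D)
    b_full = b_leading(i, D, psi_full)
    theta_full = theta_hat(i, y, D, psi_full)
    bias_corr, var_term = 0.0, 0.0
    for u in range(m):
        keep = np.arange(m) != u
        psi_u = psi_hat(y[keep], D[keep])          # delete-u estimator of psi
        bias_corr += b_leading(i, D, psi_u) - b_full
        var_term += (theta_hat(i, y, D, psi_u) - theta_full) ** 2
    c = (m - 1) / m
    return b_full - c * bias_corr + c * var_term

# Toy usage on simulated data (all values hypothetical).
rng = np.random.default_rng(0)
m, A_true, mu_true = 30, 1.0, 5.0
D = rng.uniform(0.5, 2.0, m)                       # known sampling variances
y = mu_true + rng.normal(0, np.sqrt(A_true), m) + rng.normal(0, np.sqrt(D))
print(jackknife_mse(0, y, D))
```

The first correction term removes the leading bias of plugging the M-estimator into b_i, and the squared-difference term accounts for the extra variability introduced by estimating the parameters; both are weighted by (m−1)/m in the usual jackknife fashion.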

Citation


Jiming Jiang, P. Lahiri, Shu-Mei Wan. "A unified jackknife theory for empirical best prediction with M-estimation." Ann. Statist. 30(6): 1782-1810, December 2002. https://doi.org/10.1214/aos/1043351257

Information

Published: December 2002
First available in Project Euclid: 23 January 2003

zbMATH: 1020.62025
MathSciNet: MR1969450
Digital Object Identifier: 10.1214/aos/1043351257

Subjects:
Primary: 62D05, 62G09

Keywords: M-estimators, empirical best predictors, mean squared errors, mixed linear models, mixed logistic models, small-area estimation, uniform consistency, variance components

Rights: Copyright © 2002 Institute of Mathematical Statistics
