Abstract
Let $\hat{T}_n$ be an estimate of the form $\hat{T}_n = T(\hat{F}_n)$, where $\hat{F}_n$ is the sample cdf of $n$ iid observations and $T$ is a locally quadratic functional defined on cdf's. Then, the normalized jackknife estimates for bias, skewness, and variance of $\hat{T}_n$ approximate closely their bootstrap counterparts. Each of these estimates is consistent. Moreover, the jackknife and bootstrap estimates of variance are asymptotically normal and asymptotically minimax. The main results: the first-order Edgeworth expansion estimate for the distribution of $n^{1/2}(\hat{T}_n - T(F))$, with $F$ being the actual cdf of each observation and the expansion coefficients being estimated by jackknifing, is asymptotically equivalent to the corresponding bootstrap distribution estimate, up to and including terms of order $n^{-1/2}$. Both distribution estimates are asymptotically minimax. The jackknife Edgeworth expansion estimate suggests useful corrections for skewness and bias to upper and lower confidence bounds for $T(F)$.
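The following sketch is only an illustration of the two resampling variance estimates the abstract compares, not code from the paper: it computes the jackknife and bootstrap variance estimates of a plug-in statistic $\hat{T}_n = T(\hat{F}_n)$, with the functional $T$, the sample size, and the number of bootstrap replications chosen arbitrarily for demonstration.

```python
# Illustrative sketch (assumed setup, not the paper's code): jackknife vs.
# bootstrap variance estimates of a plug-in statistic T(F_hat_n).
import numpy as np

rng = np.random.default_rng(0)

def T(x):
    # Illustrative choice of a smooth (locally quadratic) functional:
    # the sample variance, treated as T(F_hat_n).
    return np.var(x)

x = rng.normal(size=200)   # hypothetical iid sample
n = len(x)
t_hat = T(x)

# Jackknife variance estimate: leave-one-out recomputations of T.
loo = np.array([T(np.delete(x, i)) for i in range(n)])
var_jack = (n - 1) / n * np.sum((loo - loo.mean()) ** 2)

# Bootstrap variance estimate: B resamples drawn with replacement.
B = 2000
boot = np.array([T(rng.choice(x, size=n, replace=True)) for _ in range(B)])
var_boot = boot.var()

# For locally quadratic T, the abstract's claim is that the normalized
# estimates n * var_jack and n * var_boot are close to each other and both
# consistently estimate the asymptotic variance of n^{1/2}(T_hat_n - T(F)).
print(n * var_jack, n * var_boot)
```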
Citation
Rudolf Beran. "Jackknife Approximations to Bootstrap Estimates." Ann. Statist. 12 (1): 101-118, March 1984. https://doi.org/10.1214/aos/1176346395