Abstract
We consider the problem of estimating a vector ${\boldsymbol{\mu}}=(\mu_{1},\dots,\mu_{n})$ under squared loss, based on independent observations $Y_{i}\sim N(\mu_{i},1)$, $i=1,\dots,n$, and possibly additional structural assumptions. We argue that many estimators are asymptotically equivalent to an estimator of the form $\hat{\mu}_{i}=\alpha\tilde{\mu}_{i}+(1-\alpha)Y_{i}+\xi_{i}=\tilde{\mu}_{i}+(1-\alpha)(Y_{i}-\tilde{\mu}_{i})+\xi_{i}$, where $\alpha\in[0,1]$, $\tilde{\mu}_{i}$ may depend on the data but is not a function of $Y_{i}$, and $\sum\xi_{i}^{2}=o_{p}(n)$.
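For concreteness, one standard example of this form (an illustration of the representation, not taken from the paper itself) is the James–Stein type estimator that shrinks toward the grand mean, $\hat{\mu}_{i}=\bar{Y}+\bigl(1-\tfrac{n-3}{\sum_{j}(Y_{j}-\bar{Y})^{2}}\bigr)(Y_{i}-\bar{Y})$, which corresponds to $\tilde{\mu}_{i}=\bar{Y}$ and $\alpha=(n-3)/\sum_{j}(Y_{j}-\bar{Y})^{2}$, with the terms $\xi_{i}$ absorbing the asymptotically negligible dependence of $\bar{Y}$ on $Y_{i}$.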
We consider the optimal estimator of the form $\tilde{\mu}_{i}+g(Y_{i}-\tilde{\mu}_{i})$ for a general, possibly random, function $g$, and approximate it using nonparametric empirical Bayes ideas and techniques. We study both the retrospective and the sequential estimation problems, and elaborate on and demonstrate our results in the case where the $\hat{\mu}_{i}$ are Kalman filter estimators. Simulations and a real data analysis are also provided.
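To make the construction concrete, the following is a minimal sketch of a nonparametric empirical Bayes correction of the residuals $Y_{i}-\tilde{\mu}_{i}$. It is an illustration under assumptions of our own, not the authors' exact procedure: here $g$ is estimated via Tweedie's formula with a kernel density estimate of the marginal of the residuals, and the helper npeb_correction and the toy choice of mu_tilde are hypothetical.

```python
import numpy as np
from scipy.stats import gaussian_kde

def npeb_correction(y, mu_tilde, bw=None):
    """Return mu_tilde_i + g(Y_i - mu_tilde_i), with g estimated nonparametrically.

    Sketch only: g is taken from Tweedie's formula g(z) = z + f'(z)/f(z), where f is
    a kernel estimate of the marginal density of the residuals Z_i = Y_i - mu_tilde_i
    (approximately N(mu_i - mu_tilde_i, 1) when mu_tilde_i is not a function of Y_i).
    """
    z = y - mu_tilde
    kde = gaussian_kde(z, bw_method=bw)                   # estimate of the marginal density f
    f = kde(z)
    eps = 1e-3
    f_prime = (kde(z + eps) - kde(z - eps)) / (2 * eps)   # numerical derivative f'
    g = z + f_prime / np.maximum(f, 1e-12)                # Tweedie correction of the residual
    return mu_tilde + g

# Toy usage: mu_tilde is a hypothetical pilot estimate (e.g., a one-step-ahead
# prediction) that is informative about mu_i but does not use Y_i.
rng = np.random.default_rng(0)
n = 2000
mu = rng.normal(0.0, 2.0, size=n)
y = mu + rng.normal(0.0, 1.0, size=n)
mu_tilde = mu + rng.normal(0.0, 1.5, size=n)
print("MSE of Y:        ", np.mean((y - mu) ** 2))
print("MSE of corrected:", np.mean((npeb_correction(y, mu_tilde) - mu) ** 2))
```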
Citation
Eitan Greenshtein, Ariel Mantzura, Ya’acov Ritov. "Nonparametric empirical Bayes improvement of shrinkage estimators with applications to time series." Bernoulli 25(4B), 3459–3478, November 2019. https://doi.org/10.3150/18-BEJ1096