The problem of estimating several normal mean vectors in an empirical Bayes situation is considered. In this case, it reduces to the problem of estimating the inverse of a covariance matrix in the standard multivariate normal situation using a particular loss function. Estimators which dominate any constant multiple of the inverse sample covariance matrix are presented. These estimators work by shrinking the sample eigenvalues toward a central value, in much the same way as the James-Stein estimator for a mean vector shrinks the maximum likelihood estimators toward a common value. These covariance estimators then lead to a class of multivariate estimators of the mean, each of which dominates the maximum likelihood estimator.
"Multivariate Empirical Bayes and Estimation of Covariance Matrices." Ann. Statist. 4 (1): 22–32, January 1976. https://doi.org/10.1214/aos/1176343345
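The eigenvalue-shrinkage idea in the abstract can be sketched in code. The snippet below is a generic illustration, not the paper's estimator: it eigendecomposes the sample covariance matrix and pulls each sample eigenvalue toward their common mean, in the same spirit as James-Stein shrinkage of a mean vector toward a common value. The weight `alpha` is a hypothetical tuning parameter introduced here for illustration.

```python
import numpy as np

def shrink_covariance(X, alpha=0.5):
    """Illustrative eigenvalue-shrinkage covariance estimator.

    Shrinks the sample eigenvalues toward their grand mean while
    keeping the sample eigenvectors. `alpha` in [0, 1] is a
    hypothetical shrinkage weight (0 = sample covariance,
    1 = all eigenvalues replaced by their mean); it is NOT the
    data-driven rule derived in the paper.
    """
    X = np.asarray(X, dtype=float)
    S = np.cov(X, rowvar=False)           # sample covariance (p x p)
    vals, vecs = np.linalg.eigh(S)        # sample eigenvalues/vectors
    target = vals.mean()                  # central value to shrink toward
    shrunk = (1.0 - alpha) * vals + alpha * target
    return vecs @ np.diag(shrunk) @ vecs.T

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 4))
Sigma_hat = shrink_covariance(X, alpha=0.5)
```

Because the shrinkage target is the mean of the eigenvalues, this sketch preserves the trace of the sample covariance while compressing the spread of its eigenvalues, which is the qualitative behavior the abstract describes.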