Parametric statistical models satisfying suitable regularity conditions carry a natural Riemannian manifold structure, given by the information metric. Since the parameters are merely labels for the probability measures, an inferential statement should be formulated through intrinsic objects, invariant under reparametrizations. In this context, estimators are random objects taking values on the manifold corresponding to the statistical model. Despite these considerations, classical measures of an estimator's performance, such as the bias and the mean square error, clearly depend on the parametrization of the statistical model. In this paper the authors work with extended notions of mean value and moments for random objects taking values on a Hausdorff and connected manifold equipped with an affine connection; in particular, the Riemannian manifold case is considered. This extension is applied to the study of bias and mean square error in statistical point estimation theory. Under this approach an intrinsic version of the Cramér-Rao lower bound is obtained: a lower bound, depending on the intrinsic bias and the curvature of the statistical model, for the mean square of the Rao distance, the invariant analogue of the mean square error. Further, the behavior of the mean square of the Rao distance of an estimator when conditioning with respect to a sufficient statistic is considered, yielding intrinsic versions of the Rao-Blackwell and Lehmann-Scheffé theorems. Asymptotic properties complete the study.
"Intrinsic Analysis of Statistical Estimation." Ann. Statist. 23(5): 1562-1581, October 1995. https://doi.org/10.1214/aos/1176324312
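The Rao distance mentioned in the abstract is the geodesic distance induced by the Fisher information metric, and its square plays the role of an invariant mean square error. A minimal sketch of this idea, using the one-parameter Bernoulli family (not an example from the paper; the closed-form distance 2|arcsin sqrt(p1) - arcsin sqrt(p2)| is a standard textbook result for this family):

```python
import math
import random

def rao_distance_bernoulli(p1, p2):
    """Closed-form Rao distance for Bernoulli(p) models (standard result)."""
    return 2.0 * abs(math.asin(math.sqrt(p1)) - math.asin(math.sqrt(p2)))

def rao_distance_numeric(p1, p2, steps=100_000):
    """Integrate the information line element sqrt(I(p)) dp, I(p) = 1/(p(1-p)),
    along the one-dimensional parameter interval (midpoint rule)."""
    a, b = min(p1, p2), max(p1, p2)
    h = (b - a) / steps
    total = 0.0
    for i in range(steps):
        p = a + (i + 0.5) * h
        total += math.sqrt(1.0 / (p * (1.0 - p))) * h
    return total

def mean_square_rao_distance(p, n, trials=20_000, seed=0):
    """Monte Carlo estimate of E[rho^2(p_hat, p)] for the sample-mean
    estimator of Bernoulli(p) with n observations: the invariant analogue
    of the mean square error discussed in the abstract."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        k = sum(rng.random() < p for _ in range(n))
        phat = k / n
        # Clamp so arcsin stays well defined when phat hits 0 or 1.
        phat = min(max(phat, 1e-12), 1.0 - 1e-12)
        total += rao_distance_bernoulli(p, phat) ** 2
    return total / trials

if __name__ == "__main__":
    # The closed form agrees with direct integration of the metric.
    print(rao_distance_bernoulli(0.2, 0.7), rao_distance_numeric(0.2, 0.7))
    # The mean square Rao distance shrinks as the sample size grows.
    print(mean_square_rao_distance(0.3, 50), mean_square_rao_distance(0.3, 400))
```

Because the distance is computed from the information metric, it is unchanged under any smooth reparametrization of the model, which is precisely the invariance that motivates replacing the parameter-dependent mean square error with the mean square Rao distance.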