Abstract
In a right-angled triangle, the hypotenuse is the longest side. Hence, if all vectors in a given set (the hypotenuses) have the same orthogonal projection onto a certain subspace, the length of that projection is a lower bound for their lengths. Interpreting the square of such a length as the variance of an unbiased estimator produces an information bound. The Cramér-Rao bound and the van Trees inequality can be seen as consequences of this bound. Another consequence is an inequality for the minimax variance, that is, the maximal variance over shrinking neighbourhoods, minimized over all unbiased estimators. This bound is non-asymptotic and requires almost no regularity conditions.
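The information bound the abstract describes specializes to the classical Cramér-Rao inequality. A minimal numerical sketch (not from the paper; the Gaussian model and all parameter values are assumptions chosen for illustration): for n i.i.d. observations from N(theta, 1), the Fisher information is I(theta) = n, so any unbiased estimator has variance at least 1/n, and the sample mean attains this bound.

```python
import numpy as np

# Illustrative sketch (not from the paper): verify empirically that the
# variance of the sample mean of N(theta, 1) data matches the Cramér-Rao
# bound 1 / I(theta) = 1/n.
rng = np.random.default_rng(0)
theta, n, reps = 2.0, 50, 20000

samples = rng.normal(theta, 1.0, size=(reps, n))
estimates = samples.mean(axis=1)   # unbiased estimator: the sample mean

empirical_var = estimates.var()    # Monte Carlo variance of the estimator
cramer_rao = 1.0 / n               # lower bound 1 / I(theta), I(theta) = n

print(empirical_var, cramer_rao)
```

Up to Monte Carlo error, the empirical variance agrees with the bound, illustrating that the sample mean is efficient in this model.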
Citation
Andries Lenstra. "Cramér-Rao revisited." Bernoulli 11 (2) 263 - 282, April 2005. https://doi.org/10.3150/bj/1116340294