Standard large-sample maximum likelihood and Bayesian inference, based on limiting multivariate normal distributions, may be dubious when applied with small or moderate sample sizes. We define and discuss several measures of nonnormality of MLE and posterior distributions that may be used as diagnostics and can indicate whether reparameterization will be effective in improving inferences. We begin by showing how the nonlinearity measures introduced by Beale and by Bates and Watts for nonlinear regression may be generalized to exponential family nonlinear models. We replace the exponential family regression surface with another surface defined in terms of the parameterization in which the third derivatives of the loglikelihood function vanish at the MLE, and then we compute "curvatures" of the latter surface. This generalization effectively replaces the normal-theory Euclidean geometry with an $\alpha$-connection geometry of Amari identified by Kass, yet it may be understood and implemented without reference to that foundational argument. We also discuss alternative diagnostics based on the observed third derivatives of the loglikelihood function, or the third derivatives of a log posterior density. These may be viewed as multiparameter generalizations of a nonnormality measure proposed by Sprott. We show how one of these diagnostics may be quickly and easily computed using approximations of Tierney, Kass and Kadane.
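In the one-parameter case, a Sprott-type nonnormality measure can be taken as the third derivative of the loglikelihood at the MLE, standardized by the observed information raised to the 3/2 power; reparameterization changes this measure and can shrink it. The sketch below is illustrative only (the exponential-rate model, the log reparameterization, and all function names are choices made here, not the paper's notation or its multiparameter diagnostics):

```python
# Hedged sketch: a scalar nonnormality diagnostic in the spirit of Sprott's
# measure -- l'''(theta_hat) / j(theta_hat)^(3/2), where j is the observed
# information. Illustrated on an exponential(rate) model in two
# parameterizations; all analytic derivatives below are standard.

def sprott_type_diagnostic(d3_loglik, obs_info):
    """Standardized third derivative of the loglikelihood at the MLE."""
    return d3_loglik / obs_info ** 1.5

def exponential_diagnostics(data):
    """Diagnostic for an exponential(rate) model, in rate and log-rate scales."""
    n = len(data)
    s = sum(data)
    rate_hat = n / s  # MLE of the rate lambda

    # Rate parameterization: l(lam) = n*log(lam) - lam*s,
    # so l''(lam) = -n/lam**2 and l'''(lam) = 2n/lam**3.
    d2 = -n / rate_hat ** 2
    d3 = 2 * n / rate_hat ** 3
    diag_rate = sprott_type_diagnostic(d3, -d2)

    # Log parameterization: phi = log(lam), l(phi) = n*phi - exp(phi)*s.
    # At the MLE, exp(phi_hat)*s = n, so l''(phi_hat) = l'''(phi_hat) = -n.
    diag_log = sprott_type_diagnostic(-n, n)

    return diag_rate, diag_log
```

For this model the diagnostic works out to $2/\sqrt{n}$ on the rate scale and $-1/\sqrt{n}$ on the log scale regardless of the data, illustrating both the $n^{-1/2}$ decay of nonnormality and how a reparameterization can reduce the measure.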
"Some Diagnostics of Maximum Likelihood and Posterior Nonnormality." Ann. Statist. 22 (2) 668 - 695, June, 1994. https://doi.org/10.1214/aos/1176325490