Abstract
Minimum discriminant information adjustment has primarily been used in the analysis of multinomial data; however, no such restriction is necessary. Let $P$ be a distribution on $R^a$, and let $\mathscr{C}$ be a convex set of distributions on $R^a$. Let $X_i, 1 \leq i \leq n$, be independent and identically distributed observations with common distribution $P$. The minimum discriminant information adjustment (MDIA) of $P$ relative to $\mathscr{C}$ is the element $Q$ of $\mathscr{C}$ that is closest to $P$ in the sense of Kullback-Leibler discriminant information. If $\bar{P}_n$ is the empirical distribution of the $X_i, 1 \leq i \leq n$, and $\bar{Q}_n$ is the MDIA of $\bar{P}_n$ relative to $\mathscr{C}$, then $\bar{Q}_n$ is the maximum likelihood estimate in $\mathscr{C}$. Let $\mathscr{C}$ consist of the distributions $A$ on $R^a$ such that $\int T\,dA = t$, where $T$ is a measurable transformation from $R^a$ to $R^b$ and $t \in R^b$. It is shown that, under mild regularity conditions, $\bar{Q}_n$ converges weakly to $Q$, the MDIA of the true $P$, with probability 1, and that $\bar{E}_n(D) = \int D\,d\bar{Q}_n$ is an asymptotically normal and asymptotically unbiased estimate of $E(D) = \int D\,dQ$.
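In the moment-constraint case above, the MDIA of the empirical distribution has the well-known exponential-tilting (I-projection) form: $\bar{Q}_n$ places weight $w_i \propto \exp(\beta' T(X_i))$ on $X_i$, with $\beta \in R^b$ chosen so that $\sum_i w_i T(X_i) = t$. The following Python sketch solves for $\beta$ by Newton's method on the dual problem; it is a minimal illustration of this standard characterization, and the function name, solver settings, and synthetic data are assumptions for the example, not taken from the paper.

```python
import numpy as np

def mdia_weights(T_vals, t, tol=1e-10, max_iter=100):
    """Exponential-tilting weights for the MDIA of the empirical
    distribution under the moment constraint sum_i w_i T(X_i) = t.

    T_vals : (n, b) array of T(X_i); t : (b,) target moment vector.
    Returns weights w with w_i proportional to exp(beta . T(X_i)).
    """
    n, b = T_vals.shape
    beta = np.zeros(b)
    for _ in range(max_iter):
        z = T_vals @ beta
        z -= z.max()                      # stabilize the exponentials
        w = np.exp(z)
        w /= w.sum()                      # current tilted weights
        m = w @ T_vals                    # current moment E_w[T]
        g = m - t                         # dual gradient (up to sign)
        if np.linalg.norm(g) < tol:
            break
        # Hessian of the dual = covariance of T under the tilted weights
        cov = (T_vals * w[:, None]).T @ T_vals - np.outer(m, m)
        beta -= np.linalg.solve(cov, g)   # Newton step
    return w

# Illustrative use: adjust a synthetic sample so its mean is exactly 0.
rng = np.random.default_rng(0)
x = rng.normal(0.3, 1.0, size=500)
w = mdia_weights(x[:, None], np.array([0.0]))
print(w @ x)   # reweighted mean, approximately 0
```

Plug-in estimates such as $\bar{E}_n(D)$ are then weighted averages under these weights, e.g. `w @ D_vals` for an array of values $D(X_i)$.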
Citation
Shelby J. Haberman. "Adjustment by Minimum Discriminant Information." Ann. Statist. 12(3): 971–988, September 1984. https://doi.org/10.1214/aos/1176346715