Abstract
Let $X_i$ and $Y_i$ be random variables related to other random variables $U_i$, $V_i$, and $W_i$ as follows: $X_i = U_i + W_i$, $Y_i = \alpha + \beta U_i + V_i$, $i = 1, \ldots, n$, where $\alpha$ and $\beta$ are finite constants. Here $X_i$ and $Y_i$ are observable while $U_i$, $V_i$, and $W_i$ are not. This model is customarily referred to as the regression problem with errors in both variables, and the central question is the estimation of $\beta$. We give a class of estimates for $\beta$ which are asymptotically normal with mean $\beta$ and variance proportional to $n^{-1/2}$, under weak assumptions. We then show how to choose a good estimate of $\beta$ from this class.
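The following is a minimal simulation sketch of the model in the abstract, not the estimator class proposed in the paper. It only illustrates why the problem is nontrivial: because the regressor is observed with error, the naive least-squares slope of $Y$ on $X$ is attenuated toward zero and does not converge to $\beta$. All parameter values and distributions below are illustrative assumptions.

```python
# Hypothetical simulation of the errors-in-variables model:
#   X_i = U_i + W_i,   Y_i = alpha + beta * U_i + V_i.
# This is NOT the paper's estimator; it only shows the attenuation bias of
# ordinary least squares when X is measured with error.
import numpy as np

rng = np.random.default_rng(0)
n, alpha, beta = 10_000, 1.0, 2.0      # illustrative sample size and true parameters

U = rng.normal(0.0, 1.0, size=n)       # latent regressor (unobserved)
W = rng.normal(0.0, 0.5, size=n)       # measurement error in X (unobserved)
V = rng.normal(0.0, 0.5, size=n)       # error in the Y equation (unobserved)

X = U + W                              # observed
Y = alpha + beta * U + V               # observed

# Naive OLS slope = Cov(X, Y) / Var(X).  Under this Gaussian setup it converges
# to beta * Var(U) / (Var(U) + Var(W)) = 2 * 1 / 1.25 = 1.6, not beta = 2.
ols_slope = np.cov(X, Y, ddof=1)[0, 1] / np.var(X, ddof=1)
print(f"naive OLS slope: {ols_slope:.3f} (true beta = {beta})")
```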
Citation
Clifford Spiegelman. "On Estimating the Slope of a Straight Line when Both Variables are Subject to Error." Ann. Statist. 7(1): 201-206, January 1979. https://doi.org/10.1214/aos/1176344565