Open Access
April, 1965
Nonlinear Least Squares Estimation
H. O. Hartley, Aaron Booker
Ann. Math. Statist. 36(2): 638-650 (April, 1965). DOI: 10.1214/aoms/1177700171


We are given a set of $N$ responses $Y_t$ which have arisen from a nonlinear regression model \begin{equation*}\tag{1.1}Y_t = f(x_t, \theta) + e_t; \quad t = 1, 2, \cdots, N.\end{equation*} Here $x_t$ denotes the $t$th fixed input vector of $k$ elements giving rise to $Y_t$, whilst $\theta$ is an $m$-element unknown parameter vector with elements $\theta_i$ and the $e_t$ are a set of $N$ independent error residuals from $N(0, \sigma^2)$ with $\sigma^2$ unknown. The expectations of the $Y_t$ are therefore the functions $f(x_t, \theta)$, which will be assumed to satisfy certain regularity conditions. The problem is to estimate $\theta$, notably by least squares. In this paper we shall develop an iterative method of solution of the least squares equations which has the following properties: (a) the computational procedure is convergent for finite $N$; (b) the resulting estimators are asymptotically $100\%$ efficient as $N \rightarrow \infty$. In Sections 2-4 we give a survey of our results, leaving the mathematical proofs to Sections 5-7, whilst in Section 8 we illustrate our method with an example. Although our theoretical development is oriented towards our specific goals, certain results are proved in a somewhat more general form. Some of our theory will be seen to correspond to well known theorems on stochastic limits which have to be reproved because of certain modifications which we require.
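The iterative scheme referred to in the abstract is in the family of Gauss-Newton methods with a step-size modification that guarantees a decrease of the residual sum of squares at each iteration. The following is a minimal sketch of that idea, not the paper's exact algorithm: at each step the linearized normal equations are solved for a correction $\delta$, and the step is halved until the sum of squares $Q(\theta) = \sum_t (Y_t - f(x_t, \theta))^2$ does not increase. The model $f(x, \theta) = \theta_1 e^{-\theta_2 x}$ used below is an illustrative choice, not taken from the paper.

```python
import numpy as np

def gauss_newton(f, jac, x, y, theta0, max_iter=50, tol=1e-10):
    """Minimal Gauss-Newton iteration with step halving for
    minimizing Q(theta) = sum_t (y_t - f(x_t, theta))^2.
    A sketch of the general technique, not the authors' exact procedure."""
    theta = np.asarray(theta0, dtype=float)
    for _ in range(max_iter):
        r = y - f(x, theta)            # residuals e_t at current theta
        J = jac(x, theta)              # N x m Jacobian of f w.r.t. theta
        # Solve the linearized normal equations (J'J) delta = J'r
        delta, *_ = np.linalg.lstsq(J, r, rcond=None)
        q0 = r @ r
        step = 1.0
        # Halve the step until the sum of squares does not increase,
        # ensuring monotone descent of Q
        while step > 1e-12:
            candidate = theta + step * delta
            rn = y - f(x, candidate)
            if rn @ rn <= q0:
                theta = candidate
                break
            step /= 2
        if np.linalg.norm(step * delta) < tol:
            break
    return theta

# Hypothetical example: exponential decay f(x, theta) = a * exp(-b*x)
rng = np.random.default_rng(0)
x = np.linspace(0, 4, 40)
true = np.array([2.0, 0.7])
y = true[0] * np.exp(-true[1] * x) + 0.01 * rng.standard_normal(x.size)

f = lambda x, th: th[0] * np.exp(-th[1] * x)
jac = lambda x, th: np.column_stack([np.exp(-th[1] * x),
                                     -th[0] * x * np.exp(-th[1] * x)])
theta_hat = gauss_newton(f, jac, x, y, np.array([1.0, 0.3]))
```

The step-halving line search is what distinguishes this from plain Gauss-Newton; it is the mechanism by which convergence for finite $N$ can be argued, since $Q(\theta)$ is nonincreasing along the iterates.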



H. O. Hartley. Aaron Booker. "Nonlinear Least Squares Estimation." Ann. Math. Statist. 36 (2) 638 - 650, April, 1965.


Published: April, 1965
First available in Project Euclid: 27 April 2007

zbMATH: 0141.34506
MathSciNet: MR174114
Digital Object Identifier: 10.1214/aoms/1177700171

Rights: Copyright © 1965 Institute of Mathematical Statistics
