## The Annals of Mathematical Statistics

### Nonlinear Least Squares Estimation

#### Abstract

We are given a set of $N$ responses $Y_t$ which have arisen from a nonlinear regression model \begin{equation*}\tag{1.1}Y_t = f(x_t, \theta) + e_t; \quad t = 1, 2, \cdots, N.\end{equation*} Here $x_t$ denotes the $t$th fixed input vector of $k$ elements giving rise to $Y_t$, whilst $\theta$ is an $m$-element unknown parameter vector with elements $\theta_i$ and the $e_t$ are a set of $N$ independent error residuals from $N(0, \sigma^2)$ with $\sigma^2$ unknown. The expectations of the $Y_t$ are therefore the functions $f(x_t, \theta)$, which will be assumed to satisfy certain regularity conditions. The problem is to estimate $\theta$, notably by least squares. In this paper we develop an iterative method of solution of the least squares equations which has the following properties: (a) the computational procedure is convergent for finite $N$; (b) the resulting estimators are asymptotically $100\%$ efficient as $N \rightarrow \infty$. In Sections 2-4 we give a survey of our results, leaving the mathematical proofs to Sections 5-7, whilst in Section 8 we illustrate our method with an example. Although our theoretical development is oriented towards our specific goals, certain results are proved in a somewhat more general form. Some of our theory will be seen to correspond to well known theorems on stochastic limits, which have to be reproved because of certain modifications which we require.
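To make the setting of (1.1) concrete, the sketch below fits a nonlinear model by iterating on the linearized least squares equations, in the Gauss-Newton style. This is a minimal generic illustration, not the paper's own modified procedure (whose step-size control is what yields the finite-$N$ convergence claimed above); the exponential model $f(x, \theta) = \theta_1 e^{\theta_2 x}$ and all names here are hypothetical choices for the example.

```python
import numpy as np

def gauss_newton(f, jac, x, y, theta0, max_iter=50, tol=1e-10):
    """Plain Gauss-Newton iteration for nonlinear least squares.

    A sketch only: repeatedly linearize f about the current theta and
    solve the resulting linear least squares problem for the update.
    """
    theta = np.asarray(theta0, dtype=float)
    for _ in range(max_iter):
        r = y - f(x, theta)              # current residuals Y_t - f(x_t, theta)
        J = jac(x, theta)                # N x m Jacobian of f w.r.t. theta
        step, *_ = np.linalg.lstsq(J, r, rcond=None)
        theta = theta + step
        if np.linalg.norm(step) < tol:   # stop once updates are negligible
            break
    return theta

# Hypothetical model f(x, theta) = theta_1 * exp(theta_2 * x).
f = lambda x, th: th[0] * np.exp(th[1] * x)
jac = lambda x, th: np.column_stack(
    [np.exp(th[1] * x), th[0] * x * np.exp(th[1] * x)]
)

x = np.linspace(0.0, 1.0, 20)
y = f(x, np.array([2.0, -1.5]))          # noise-free data for illustration
theta_hat = gauss_newton(f, jac, x, y, [1.0, -1.0])
```

With noise-free data and a reasonable starting value, the iteration recovers $\theta = (2.0, -1.5)$; with noisy $Y_t$ it returns the least squares estimator instead.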

#### Article information

Source
Ann. Math. Statist., Volume 36, Number 2 (1965), 638-650.

Dates
First available in Project Euclid: 27 April 2007

https://projecteuclid.org/euclid.aoms/1177700171

Digital Object Identifier
doi:10.1214/aoms/1177700171

Mathematical Reviews number (MathSciNet)
MR174114

Zentralblatt MATH identifier
0141.34506
