Abstract
This paper is concerned with parametric regression models of the form $Y_{ij} = f(t_{ij}, \theta_i) + \text{error}$, $i = 1, \ldots, n$, $j = 1, \ldots, T_i$, where the continuous function $f$ may depend nonlinearly on the known regressors $t_{ij}$ and on the unknown parameter vectors $\theta_i$. The assumption of an a priori known $f$ is dropped and replaced by the requirement that qualitative information about the structure of the model is available or can be generated by a preliminary exploratory data analysis. This framework, in which both $f$ and the individual parameter vectors are unknown, necessitates a detailed discussion of the identifiability of the model and its parameters. A method is then proposed for estimating $f$ and the $\theta_i$ simultaneously by exploiting this prior structural information. An iterative algorithm that simplifies computation of the estimates is presented, and, as $\min\{n, T_1, \ldots, T_n\} \rightarrow \infty$, conditions are established for strong uniform consistency of the resulting estimators of $f$ and for strong consistency of the estimators of the $\theta_i$. Some examples illustrating the method are included.
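To make the setting concrete, the following is a minimal numerical sketch (not the authors' algorithm) of alternating estimation in a hypothetical shape-invariant special case $Y_{ij} = a_i\, g(t_{ij} - s_i) + \text{error}$, where a common shape $g$ plays the role of $f$ and $\theta_i = (a_i, s_i)$. The simulated data, the gridding of $g$, and the normalization used to enforce identifiability are illustrative assumptions, not material from the paper.

import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Simulate data from a hypothetical shape-invariant model (illustration only):
# Y_ij = a_i * g(t_ij - s_i) + noise, with common shape g and unknown (a_i, s_i).
def g_true(t):
    return np.exp(-0.5 * ((t - 0.5) / 0.15) ** 2)   # common "shape" function

n, T = 8, 60
t = np.linspace(0.0, 1.0, T)                        # same design points for every unit here
a_true = rng.uniform(0.8, 1.2, n)                   # individual amplitude parameters
s_true = rng.uniform(-0.05, 0.05, n)                # individual shift parameters
Y = np.array([a * g_true(t - s) for a, s in zip(a_true, s_true)])
Y += 0.02 * rng.standard_normal(Y.shape)

# Alternating estimation of the common curve and the individual parameters.
grid = np.linspace(-0.1, 1.1, 200)                  # evaluation grid for the curve estimate

def eval_curve(curve, x):
    return np.interp(x, grid, curve)

theta = np.tile([1.0, 0.0], (n, 1))                 # start with a_i = 1, s_i = 0
for _ in range(20):
    # Step 1: update the common curve by averaging back-transformed observations on the grid.
    aligned = np.array([np.interp(grid, t - s, y / a) for (a, s), y in zip(theta, Y)])
    curve = aligned.mean(axis=0)

    # Step 2: update each theta_i by least squares, holding the current curve fixed.
    for i in range(n):
        def obj(p):
            return np.sum((Y[i] - p[0] * eval_curve(curve, t - p[1])) ** 2)
        theta[i] = minimize(obj, theta[i], method="Nelder-Mead").x

    # Identifiability: normalize so the average amplitude is 1 and the average shift is 0,
    # since (curve, theta) is otherwise determined only up to a common rescaling/shift.
    theta[:, 0] /= theta[:, 0].mean()
    theta[:, 1] -= theta[:, 1].mean()

print("estimated amplitudes:", np.round(theta[:, 0], 3))
print("true amplitudes     :", np.round(a_true / a_true.mean(), 3))

The alternation between updating the curve and updating the individual parameters mirrors the idea of estimating $f$ and the $\theta_i$ simultaneously, and the normalization step illustrates why an explicit identifiability discussion is needed when both are unknown.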
Citation
Alois Kneip and Theo Gasser. "Convergence and Consistency Results for Self-Modeling Nonlinear Regression." Ann. Statist. 16(1): 82–112, March 1988. https://doi.org/10.1214/aos/1176350692