Open Access
A Problem in Minimax Variance Polynomial Extrapolation
A. Levine
Ann. Math. Statist. 37(4): 898-903 (August, 1966). DOI: 10.1214/aoms/1177699371


For the problem of optimum prediction by means of $k$th degree polynomial regression, it is shown in [3] how to find the observation points and respective proportions of observations in the interval $\lbrack -1, 1 \rbrack$ in order to obtain the minimax variance over the interval $\lbrack -1, t \rbrack$ of the predicted regression value for all $t \geqq t_1 > 1$; here $t_1$ is the point outside the interval of observations at which the Chebyshev polynomial of degree $k$ equals the maximum value of the variance of the least squares estimate in $\lbrack -1, 1 \rbrack$. It is shown herein that if the observation points and proportions are chosen as specified in [3], then the maximum of the "least squares" variance in the interval $\lbrack -1, 1 \rbrack$ is attained at $-1$. As a consequence, an equation is developed which permits the evaluation of $t_1$ as a function of $k$. Moreover, it is shown that $t_1 \rightarrow 1$ as $k \rightarrow \infty$, so that, for large $k$, the solution given in [3] yields an approximation to the minimax variance over the interval $\lbrack -1, t \rbrack$, all $t > 1$.
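The limiting behavior $t_1 \rightarrow 1$ can be illustrated numerically. For $t > 1$ the Chebyshev polynomial of the first kind satisfies the identity $T_k(t) = \cosh(k \operatorname{arccosh} t)$, so solving $T_k(t) = c$ for a fixed right-hand side $c > 1$ gives $t = \cosh(\operatorname{arccosh}(c)/k)$, which tends to $1$ as $k \rightarrow \infty$. The sketch below uses a fixed stand-in value $c$ for the maximum least-squares variance (which in the paper depends on $k$); the function name `t1_of_degree` is for illustration only.

```python
import math

def t1_of_degree(k, c):
    """Solve T_k(t) = c for t > 1, where T_k is the degree-k Chebyshev
    polynomial of the first kind. Since T_k(t) = cosh(k * arccosh(t))
    for t > 1, the unique root above 1 is t = cosh(arccosh(c) / k)."""
    return math.cosh(math.acosh(c) / k)

# For a fixed c > 1 the root decreases toward 1 as the degree k grows,
# mirroring the t_1 -> 1 behavior described in the abstract.
for k in (2, 5, 10, 50):
    print(k, t1_of_degree(k, c=10.0))
```

Note that with $k = 1$ the identity reduces to $T_1(t) = t$, so the root is $c$ itself; this gives a quick sanity check on the formula.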


Citation

A. Levine. "A Problem in Minimax Variance Polynomial Extrapolation." Ann. Math. Statist. 37 (4) 898 - 903, August, 1966.


Published: August, 1966
First available in Project Euclid: 27 April 2007

zbMATH: 0147.37306
MathSciNet: MR195215
Digital Object Identifier: 10.1214/aoms/1177699371

Rights: Copyright © 1966 Institute of Mathematical Statistics
