The Annals of Statistics

An Optimum Design for Estimating the First Derivative

Roy V. Erickson, Vaclav Fabian, and Jan Marik

Full-text: Open access


Abstract

An optimum design of experiment for a class of estimates of the first derivative at 0 (used in stochastic approximation and density estimation) is shown to be equivalent to the problem of finding a point of minimum of the function $\Gamma$ defined by $\Gamma (x) = \det\lbrack 1, x^3,\ldots, x^{2m-1} \rbrack/\det\lbrack x, x^3,\ldots, x^{2m-1} \rbrack$ on the set of all $m$-dimensional vectors with components satisfying $0 < x_1 < -x_2 < \cdots < (-1)^{m-1} x_m$ and $\Pi|x_i| = 1$. (In the determinants, 1 is the column vector with all components 1, and $x^i$ has components of $x$ raised to the $i$-th power.) The minimum of $\Gamma$ is shown to be $m$, and the point at which the minimum is attained is characterized by Chebyshev polynomials of the second kind.
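The function $\Gamma$ and its constraint set are concrete enough to check numerically. Below is a minimal sketch (assuming NumPy; the helper names `gamma` and `random_feasible` are ours, not from the paper) that evaluates $\Gamma$ and samples feasible points with alternating signs, increasing magnitudes, and unit magnitude product. For $m = 2$ the constraints reduce to $x = (t, -1/t)$ with $0 < t < 1$, and a short computation gives $\Gamma = (t^3 + t^{-3})/(t^{-2} - t^2)$, which equals $m = 2$ at $t = (\sqrt 5 - 1)/2 = 2\cos(2\pi/5)$, consistent with the Chebyshev characterization.

```python
import numpy as np

def gamma(x):
    """Evaluate G(x) = det[1, x^3, ..., x^(2m-1)] / det[x, x^3, ..., x^(2m-1)],
    where 1 is the all-ones column and x^p is the componentwise power."""
    m = len(x)
    odd_powers = list(range(1, 2 * m, 2))          # 1, 3, ..., 2m-1
    num = np.column_stack([np.ones(m)] + [x ** p for p in odd_powers[1:]])
    den = np.column_stack([x ** p for p in odd_powers])
    return np.linalg.det(num) / np.linalg.det(den)

def random_feasible(m, rng):
    """Sample a point with 0 < x_1 < -x_2 < x_3 < ... and prod |x_i| = 1."""
    a = np.sort(rng.uniform(0.1, 2.0, size=m))     # increasing magnitudes (a.s. distinct)
    a /= np.prod(a) ** (1.0 / m)                   # rescale so the magnitude product is 1
    return a * np.array([(-1) ** i for i in range(m)], dtype=float)
```

Sampling many feasible points and comparing `gamma` against $m$ gives a quick sanity check of the stated minimum: random points never fall below $m$, and the $m = 2$ golden-ratio point attains it exactly.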

Article information

Ann. Statist., Volume 23, Number 4 (1995), 1234-1247.

First available in Project Euclid: 11 April 2007

Digital Object Identifier: doi:10.1214/aos/1176324707

Primary: 62K05: Optimal designs
Secondary: 62L20: Stochastic approximation; 15A15: Determinants, permanents, other special matrix functions [See also 19B10, 19B14]

Keywords: Stochastic approximation; determinants; linear independence; orthogonal polynomials; Chebyshev polynomials of second kind


Erickson, Roy V.; Fabian, Vaclav; Marik, Jan. An Optimum Design for Estimating the First Derivative. Ann. Statist. 23 (1995), no. 4, 1234--1247. doi:10.1214/aos/1176324707.
