Open Access
June, 1954 Approximation Methods which Converge with Probability one
Julius R. Blum
Ann. Math. Statist. 25(2): 382-386 (June, 1954). DOI: 10.1214/aoms/1177728794


Let $H(y\mid x)$ be a family of distribution functions depending upon a real parameter $x,$ and let $M(x) = \int^\infty_{-\infty} y\, dH(y \mid x)$ be the corresponding regression function. It is assumed that $M(x)$ is unknown to the experimenter, who is, however, allowed to take observations on $H(y\mid x)$ for any value $x.$ Robbins and Monro [1] give a method for defining successively a sequence $\{x_n\}$ such that $x_n$ converges to $\theta$ in probability, where $\theta$ is a root of the equation $M(x) = \alpha$ and $\alpha$ is a given number. Wolfowitz [2] generalizes these results, and Kiefer and Wolfowitz [3] solve a similar problem in the case when $M(x)$ has a maximum at $x = \theta.$ Using a lemma due to Loève [4], we show that in both cases $x_n$ converges to $\theta$ with probability one, under weaker conditions than those imposed in [2] and [3]. Further, we solve a similar problem in the case when $M(x)$ is the median of $H(y \mid x).$
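The Robbins-Monro scheme referred to above iterates $x_{n+1} = x_n + a_n(\alpha - y_n),$ where $y_n$ is a single observation drawn from $H(y \mid x_n)$ and the step sizes $a_n$ satisfy $\sum a_n = \infty$ and $\sum a_n^2 < \infty$ (e.g. $a_n = c/n$). The following is a minimal illustrative sketch, not from the paper; the function names and the toy regression $M(x) = x$ are assumptions for demonstration only.

```python
import random

def robbins_monro(observe, alpha, x0=0.0, n_steps=5000, c=1.0):
    """Robbins-Monro iteration x_{n+1} = x_n + a_n * (alpha - y_n),
    with step sizes a_n = c/n, so that sum a_n = inf and sum a_n^2 < inf.
    `observe(x)` returns one draw from H(. | x)."""
    x = x0
    for n in range(1, n_steps + 1):
        y = observe(x)              # noisy observation with mean M(x)
        x = x + (c / n) * (alpha - y)
    return x

# Toy example: M(x) = x observed with unit-variance Gaussian noise;
# the root of M(x) = 0.5 is theta = 0.5.
random.seed(0)
theta_hat = robbins_monro(lambda x: x + random.gauss(0, 1), alpha=0.5)
```

Under the paper's conditions, $x_n \to \theta$ not merely in probability but with probability one, so a single long run of this iteration converges almost surely.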



Julius R. Blum. "Approximation Methods which Converge with Probability one." Ann. Math. Statist. 25 (2) 382 - 386, June, 1954.


Published: June, 1954
First available in Project Euclid: 28 April 2007

zbMATH: 0055.37806
MathSciNet: MR62399
Digital Object Identifier: 10.1214/aoms/1177728794

Rights: Copyright © 1954 Institute of Mathematical Statistics
