Abstract
We study iterative subgradient methods for nonsmooth convex constrained optimization problems in a uniformly convex and uniformly smooth Banach space, in which each subgradient step is followed by a metric or generalized projection onto the feasible set. The normalized stepsizes $\alpha_n$ are chosen {\em a priori}, satisfying the conditions $\sum_{n=0}^\infty\alpha_n=\infty$, $\alpha_n \to 0.$ We prove that every sequence generated in this way is weakly convergent in the average to a minimizer, provided the problem has solutions. In addition, we show that the perturbed $\epsilon_n$-subgradient method is stable when $\epsilon_n \to 0.$ The more general case of variational inequalities with monotone (possibly nonpotential) operators is also considered.
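To make the scheme concrete, here is a minimal numerical sketch in the special Euclidean case (where the metric and generalized projections both reduce to the ordinary orthogonal projection): a normalized projected subgradient method with a priori stepsizes $\alpha_n = 1/\sqrt{n+1}$, which satisfy $\sum_n \alpha_n = \infty$ and $\alpha_n \to 0$, returning the $\alpha$-weighted average of the iterates. The test problem (a polyhedral objective over a box) is an illustrative assumption, not from the paper, and the Banach-space machinery is deliberately suppressed.

```python
import numpy as np

def projected_subgradient_average(subgrad, project, x0, n_iter=5000):
    """Normalized projected subgradient method with a priori stepsizes
    alpha_n = 1/sqrt(n+1); returns the alpha-weighted average of the
    iterates (the 'convergence in the average' object of the paper,
    sketched here in the Euclidean special case)."""
    x = np.asarray(x0, dtype=float)
    avg = np.zeros_like(x)
    total_weight = 0.0
    for n in range(n_iter):
        g = subgrad(x)
        alpha = 1.0 / np.sqrt(n + 1)
        # normalized step: divide by the subgradient norm (guard against 0)
        x = project(x - alpha * g / max(np.linalg.norm(g), 1e-12))
        avg += alpha * x
        total_weight += alpha
    return avg / total_weight

# Illustrative problem (not from the paper):
# minimize f(x) = |x_0 - 1| + |x_1 + 2| over the box [-1, 1]^2,
# whose minimizer is (1, -1) with optimal value f* = 1.
a = np.array([1.0, -2.0])
f = lambda x: np.sum(np.abs(x - a))
xbar = projected_subgradient_average(
    subgrad=lambda x: np.sign(x - a),       # a subgradient of the l1 distance
    project=lambda x: np.clip(x, -1.0, 1.0),  # projection onto the box
    x0=np.zeros(2),
)
```

Under these stepsizes the individual iterates need not converge for nonsmooth $f$, which is exactly why the averaged sequence is the natural object of study.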
Citation
Ya. I. Alber. "On Average Convergence of the Iterative Projection Methods." Taiwanese J. Math. 6 (3), 323-341, 2002. https://doi.org/10.11650/twjm/1500558299