Taiwanese Journal of Mathematics

ON AVERAGE CONVERGENCE OF THE ITERATIVE PROJECTION METHODS

Ya. I. Alber


Abstract

We study iterative subgradient methods for nonsmooth convex constrained optimization problems in a uniformly convex and uniformly smooth Banach space, where each subgradient step is followed by a metric or generalized projection onto the feasible set. The normalized stepsizes $\alpha_n$ are chosen {\em a priori}, satisfying the conditions $\sum_{n=0}^\infty\alpha_n=\infty$, $\alpha_n \to 0.$ We prove that, if the problem has solutions, every sequence generated in this way converges weakly in the average to a minimizer. In addition, we show that the perturbed $\epsilon_n$-subgradient method is stable when $\epsilon_n \to 0.$ The more general case of variational inequalities with monotone (possibly nonpotential) operators is also considered.
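The scheme studied in the paper can be sketched in a simple finite-dimensional (Hilbert-space) setting, where the duality mapping is the identity and the generalized projection reduces to the metric projection. The objective, constraint set, and the particular stepsize rule $\alpha_n = 1/(n+1)$ below are illustrative assumptions, not taken from the paper; they only satisfy the stated conditions $\sum\alpha_n=\infty$, $\alpha_n\to 0$, with the Cesàro average of the iterates as the returned approximation.

```python
import numpy as np

def projected_subgradient(subgrad, project, x0, n_iters=2000):
    """Projected subgradient method with a priori stepsizes
    alpha_n = 1/(n+1) (so sum alpha_n = inf and alpha_n -> 0).
    Returns the Cesaro average of the iterates."""
    x = np.asarray(x0, dtype=float)
    avg = np.zeros_like(x)
    for n in range(n_iters):
        g = subgrad(x)
        alpha = 1.0 / (n + 1)
        norm_g = np.linalg.norm(g)
        if norm_g > 0:
            # normalized step alpha_n * g / ||g||, then metric projection
            x = project(x - alpha * g / norm_g)
        avg += (x - avg) / (n + 1)  # running Cesaro average of iterates
    return avg

# Illustrative problem: minimize the nonsmooth convex function
# f(x) = ||x - c||_1 over the closed unit Euclidean ball.
c = np.array([2.0, 0.5])
subgrad = lambda x: np.sign(x - c)   # a subgradient of f at x

def project(y):                      # metric projection onto the unit ball
    r = np.linalg.norm(y)
    return y if r <= 1.0 else y / r

x_avg = projected_subgradient(subgrad, project, x0=np.zeros(2))
```

Since the individual iterates may oscillate near the solution set (the subgradient sign flips across the kink of the $\ell_1$-norm), it is the averaged sequence, not the last iterate, that is the natural output here, matching the average-convergence viewpoint of the paper.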

Article information

Source
Taiwanese J. Math., Volume 6, Number 3 (2002), 323-341.

Dates
First available in Project Euclid: 20 July 2017

Permanent link to this document
https://projecteuclid.org/euclid.twjm/1500558299

Digital Object Identifier
doi:10.11650/twjm/1500558299

Mathematical Reviews number (MathSciNet)
MR1921596

Zentralblatt MATH identifier
1021.90041

Subjects
Primary: 90C25 Convex programming; 90C30 Nonlinear programming; 49J40 Variational methods including variational inequalities [See also 47J20]

Keywords
iterative method; nonsmooth convex functional; variational inequality; subgradient; $\epsilon$-subgradient; duality mapping; generalized projection; Lyapunov functionals; Young-Fenchel transformation; Ces\`aro averages; convergence; stability

Citation

Alber, Ya. I. ON AVERAGE CONVERGENCE OF THE ITERATIVE PROJECTION METHODS. Taiwanese J. Math. 6 (2002), no. 3, 323--341. doi:10.11650/twjm/1500558299. https://projecteuclid.org/euclid.twjm/1500558299

