Open Access
August 2014 | A modified scaled conjugate gradient method with global convergence for nonconvex functions
Saman Babaie-Kafaki, Reza Ghanbari
Bull. Belg. Math. Soc. Simon Stevin 21(3): 465–477 (August 2014). DOI: 10.36045/bbms/1407765884

Abstract

Following Andrei's approach, a modified scaled memoryless BFGS preconditioned conjugate gradient method is proposed based on the modified secant equation suggested by Li and Fukushima. It is shown that the method is globally convergent without a convexity assumption on the objective function. Furthermore, for uniformly convex objective functions, the sufficient descent property of the method is established based on an eigenvalue analysis. Numerical experiments are employed to demonstrate the efficiency of the method.
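For context (a sketch drawn from the general quasi-Newton literature, not from the paper itself; the precise safeguard term used by Li and Fukushima may differ in detail), the classical secant equation and the kind of modification it admits can be written as:

```latex
% Standard secant equation for quasi-Newton updates:
%   B_{k+1} s_k = y_k,
% with step and gradient differences
%   s_k = x_{k+1} - x_k,  y_k = g_{k+1} - g_k.
%
% Modified secant equations of Li--Fukushima type replace y_k by a
% corrected vector \bar{y}_k = y_k + h_k s_k with a safeguard h_k > 0
% chosen so that \bar{y}_k^\top s_k > 0 holds even when f is nonconvex,
% keeping the (memoryless) BFGS update well defined.
\begin{align}
  B_{k+1} s_k &= \bar{y}_k, \\
  \bar{y}_k   &= y_k + h_k s_k, \qquad h_k > 0 \ \text{a safeguard parameter}.
\end{align}
```

The positivity of the curvature estimate \(\bar{y}_k^\top s_k\) is what allows global convergence arguments to proceed without assuming convexity of the objective.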

Citation


Saman Babaie-Kafaki, Reza Ghanbari. "A modified scaled conjugate gradient method with global convergence for nonconvex functions." Bull. Belg. Math. Soc. Simon Stevin 21(3): 465–477, August 2014. https://doi.org/10.36045/bbms/1407765884

Information

Published: August 2014
First available in Project Euclid: 11 August 2014

zbMATH: 1305.90379
MathSciNet: MR3250773
Digital Object Identifier: 10.36045/bbms/1407765884

Subjects:
Primary: 15A18, 49M37, 65K05, 90C53

Keywords: Conjugate gradient algorithm, Descent condition, Global convergence, Secant equation, Unconstrained optimization

Rights: Copyright © 2014 The Belgian Mathematical Society

Vol. 21 • No. 3 • August 2014