A New Family of Globally Convergent Conjugate Gradient Methods

Authors: B. Sellami, Y. Laskri, M. Belloufi

Abstract:

Conjugate gradient methods are an important class of methods for unconstrained optimization, especially for large-scale problems, and they have been studied extensively in recent years. In this paper, a new family of conjugate gradient methods is proposed for unconstrained optimization. The family includes two existing practical nonlinear conjugate gradient methods as special cases, produces a descent search direction at every iteration, and converges globally provided that the line search satisfies the Wolfe conditions. Numerical experiments are carried out to test the efficiency of the new methods, and the results suggest that they are promising. In addition, the methods related to this family are discussed in a uniform way.
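The abstract does not state the family's update formula for the parameter β_k. As a rough illustration of the nonlinear conjugate gradient framework it refers to, the minimal sketch below implements a generic CG iteration with a Wolfe line search (via `scipy.optimize.line_search`), substituting the classical Fletcher-Reeves β_k as a placeholder for the proposed family; it is not the authors' method.

```python
import numpy as np
from scipy.optimize import line_search

def nonlinear_cg(f, grad, x0, tol=1e-6, max_iter=500):
    """Generic nonlinear conjugate gradient iteration with a Wolfe
    line search. The beta_k rule below is the classical
    Fletcher-Reeves choice, used only as a stand-in for the
    paper's family of updates."""
    x = x0.astype(float)
    g = grad(x)
    d = -g                        # initial direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Step length satisfying the Wolfe conditions
        alpha = line_search(f, grad, x, d, gfk=g)[0]
        if alpha is None:         # line search failed: restart with -g
            d, alpha = -g, 1e-4
        x = x + alpha * d
        g_new = grad(x)
        # Fletcher-Reeves beta (placeholder for the proposed family)
        beta = g_new.dot(g_new) / g.dot(g)
        d = -g_new + beta * d
        g = g_new
    return x

# Example: minimize the Rosenbrock function
if __name__ == "__main__":
    f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
    grad = lambda x: np.array([
        -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
        200 * (x[1] - x[0]**2),
    ])
    print(nonlinear_cg(f, grad, np.array([-1.2, 1.0])))
```

Any β_k from the family described in the paper could be dropped in at the marked line; the descent and global convergence properties claimed in the abstract depend on that specific choice together with the Wolfe conditions.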

Keywords: conjugate gradient method, global convergence, line search, unconstrained optimization
