Guaranteed descent conjugate gradient methods with modified secant condition
Conjugate gradient methods are typically used to solve large-scale
unconstrained optimization problems. Recently, Hager and Zhang
(2006) proposed two guaranteed descent conjugate gradient methods.
In this paper, following Hager and Zhang (2006), we use the
modified secant condition given by Zhang et al. (1999) to present two
new descent conjugate gradient methods. An interesting feature of
these new methods is that they exploit both gradient and function
value information. Under suitable assumptions, global
convergence properties for these methods are established. Numerical
comparisons with the Hager-Zhang methods are given.
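For reference, the modified secant condition of Zhang et al. (1999) is commonly stated as follows (a sketch of the standard form; the exact variant adopted in this paper may differ):
$$B_{k+1} s_k = \tilde{y}_k, \qquad \tilde{y}_k = y_k + \frac{\theta_k}{s_k^{T} u_k}\, u_k, \qquad \theta_k = 6\,(f_k - f_{k+1}) + 3\,(g_k + g_{k+1})^{T} s_k,$$
where $s_k = x_{k+1} - x_k$, $y_k = g_{k+1} - g_k$, $f_k$ and $g_k$ denote the objective value and gradient at $x_k$, and $u_k$ is any vector satisfying $s_k^{T} u_k \neq 0$. The correction term $\theta_k$ is what brings function value information, in addition to gradients, into the condition.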