
Guaranteed descent conjugate gradient methods with modified secant condition

Abstract
Conjugate gradient methods are widely used to solve large-scale unconstrained optimization problems. Recently, Hager and Zhang (2006) proposed two conjugate gradient methods with guaranteed descent. In this paper, following Hager and Zhang (2006), we use the modified secant condition of Zhang et al. (1999) to derive two new descent conjugate gradient methods. An interesting feature of the new methods is that they exploit both gradient and function value information. Under suitable assumptions, global convergence is established for both methods. Numerical comparisons with the Hager-Zhang methods are reported.
    Mathematics Subject Classification: Primary: 65K05, 90C06, 90C30; Secondary: 49D27.
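To make the setting concrete, the following is a minimal sketch of a nonlinear conjugate gradient iteration using the Hager-Zhang update for the direction parameter. It is illustrative only: the function name `hz_cg` and the plain backtracking Armijo line search are assumptions made here for brevity (the Hager-Zhang methods, and the new methods in this paper, are analyzed with a Wolfe-type line search), and this sketch does not include the modified secant condition that distinguishes the proposed methods.

```python
import numpy as np

def hz_cg(f, grad, x0, tol=1e-6, max_iter=1000):
    """Illustrative nonlinear CG with the Hager-Zhang direction update.

    Hypothetical sketch: uses a simple backtracking Armijo line search
    in place of the Wolfe-type search used in the convergence analysis.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking Armijo line search (stand-in for a Wolfe search).
        alpha = 1.0
        while f(x + alpha * d) > f(x) + 1e-4 * alpha * (g @ d) and alpha > 1e-16:
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g
        dy = d @ y
        if abs(dy) < 1e-12:
            # Safeguard: restart with steepest descent on breakdown.
            d = -g_new
        else:
            # Hager-Zhang beta; the resulting direction satisfies the
            # descent bound d^T g <= -(7/8) * ||g||^2.
            beta = ((y - 2.0 * d * (y @ y) / dy) @ g_new) / dy
            d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Usage: minimize a convex quadratic 0.5 * (x - c)^T A (x - c).
A = np.diag([1.0, 10.0])
c = np.array([1.0, 1.0])
f = lambda x: 0.5 * (x - c) @ A @ (x - c)
grad = lambda x: A @ (x - c)
x_star = hz_cg(f, grad, np.zeros(2))
```

The restart safeguard is a common practical device, not part of the Hager-Zhang formula itself; it simply avoids division by a near-zero curvature term `d @ y`.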

