October 2008, 4(4): 739-755. doi: 10.3934/jimo.2008.4.739

Guaranteed descent conjugate gradient methods with modified secant condition

Shishun Li, Zhengda Huang

Department of Mathematics, Zhejiang University, Hangzhou 310027, China

Received: January 2007. Revised: July 2008. Published: October 2008.

Conjugate gradient methods are widely used to solve large-scale unconstrained optimization problems. Recently, Hager and Zhang (2006) proposed two conjugate gradient methods with guaranteed descent. In this paper, following Hager and Zhang (2006), we use the modified secant condition of Zhang et al. (1999) to derive two new descent conjugate gradient methods. An interesting feature of the new methods is that they exploit both gradient and function value information. Under suitable assumptions, global convergence of both methods is established. Numerical comparisons with the Hager-Zhang methods are reported.
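For orientation, the following LaTeX sketch records the two ingredients the abstract refers to, as they are commonly stated in the cited works (Hager and Zhang, 2006; Zhang et al., 1999); the exact parameter choices and any truncation used in this paper may differ in detail.

    % Notation: f_k = f(x_k), g_k = \nabla f(x_k),
    % s_k = x_{k+1} - x_k, y_k = g_{k+1} - g_k.
    %
    % Hager--Zhang direction and parameter (Hager and Zhang, 2006):
    \[
      d_{k+1} = -g_{k+1} + \beta_k^{HZ} d_k, \qquad
      \beta_k^{HZ} = \frac{1}{d_k^{T} y_k}
        \left( y_k - 2\,\frac{\|y_k\|^{2}}{d_k^{T} y_k}\, d_k \right)^{\!T} g_{k+1},
    \]
    % which satisfies the sufficient descent bound
    % g_k^T d_k <= -(7/8) \|g_k\|^2 independently of the line search.
    %
    % Modified secant condition of Zhang et al. (1999): replace y_k by
    \[
      \tilde{y}_k = y_k + \frac{\vartheta_k}{s_k^{T} u_k}\, u_k, \qquad
      \vartheta_k = 6\left(f_k - f_{k+1}\right)
                    + 3\left(g_k + g_{k+1}\right)^{T} s_k,
    \]
    % where u_k is any vector with s_k^T u_k \ne 0 (u_k = s_k is common),
    % so that both gradient and function-value information enter the update.

The new methods combine these two ideas; see the paper for the precise formulas and the assumptions under which global convergence holds.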
Citation: Shishun Li, Zhengda Huang. Guaranteed descent conjugate gradient methods with modified secant condition. Journal of Industrial & Management Optimization, 2008, 4 (4) : 739-755. doi: 10.3934/jimo.2008.4.739