April 2017, 13(2): 649-658. doi: 10.3934/jimo.2016038

A class of descent four–term extension of the Dai–Liao conjugate gradient method based on the scaled memoryless BFGS update

1. Department of Mathematics, Faculty of Mathematics, Statistics and Computer Science, Semnan University, P.O. Box: 35195–363, Semnan, Iran

2. Faculty of Mathematical Sciences, Ferdowsi University of Mashhad, P.O. Box: 9177948953, Mashhad, Iran

* Corresponding author: Saman Babaie–Kafaki

Received: November 2014. Revised: December 2015. Published: May 2016.

By hybridizing the three–term conjugate gradient method proposed by Zhang et al. with the nonlinear conjugate gradient method of Dai and Liao based on the scaled memoryless BFGS update, a one–parameter class of four–term conjugate gradient methods is proposed. The suggested class is shown to possess the sufficient descent property without any convexity assumption on the objective function. A brief global convergence analysis is given for uniformly convex objective functions. Results of numerical comparisons are reported; they demonstrate the efficiency of a method from the proposed class in the sense of the Dolan–Moré performance profile.
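For orientation, the two ingredient search directions are recalled below in their standard forms from [10] and [34]; this is only a reader's sketch of the building blocks, not the paper's four–term direction, whose exact combination and free parameter are defined in the text. With $s_k = x_{k+1} - x_k$ and $y_k = g_{k+1} - g_k$, the Dai–Liao direction is
\[
d_{k+1} = -g_{k+1} + \beta_k^{\mathrm{DL}} d_k, \qquad
\beta_k^{\mathrm{DL}} = \frac{g_{k+1}^{\top} y_k - t\, g_{k+1}^{\top} s_k}{d_k^{\top} y_k}, \quad t > 0,
\]
and the three–term direction of Zhang et al. is
\[
d_{k+1} = -g_{k+1} + \frac{g_{k+1}^{\top} y_k}{d_k^{\top} y_k}\, d_k - \frac{g_{k+1}^{\top} d_k}{d_k^{\top} y_k}\, y_k,
\]
which satisfies the sufficient descent relation $d_{k+1}^{\top} g_{k+1} = -\|g_{k+1}\|^2$ independently of the line search.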

Citation: Saman Babaie–Kafaki, Reza Ghanbari. A class of descent four–term extension of the Dai–Liao conjugate gradient method based on the scaled memoryless BFGS update. Journal of Industrial and Management Optimization, 2017, 13 (2) : 649-658. doi: 10.3934/jimo.2016038
References:
[1]

N. Andrei, Accelerated scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization, European J. Oper. Res., 204 (2010), 410-420.  doi: 10.1016/j.ejor.2009.11.030.

[2]

S. Babaie–Kafaki, On the sufficient descent condition of the Hager–Zhang conjugate gradient methods, 4OR, 12 (2014), 285-292.  doi: 10.1007/s10288-014-0255-6.

[3]

S. Babaie–Kafaki and R. Ghanbari, A descent family of Dai–Liao conjugate gradient methods, Optim. Methods Softw., 29 (2014), 583-591.  doi: 10.1080/10556788.2013.833199.

[4]

S. Babaie–Kafaki and R. Ghanbari, Two modified three–term conjugate gradient methods with sufficient descent property, Optim. Lett., 8 (2014), 2285-2297.  doi: 10.1007/s11590-014-0736-8.

[5]

S. Babaie–Kafaki and R. Ghanbari, A hybridization of the Hestenes–Stiefel and Dai–Yuan conjugate gradient methods based on a least–squares approach, Optim. Methods Softw., 30 (2015), 673-681.  doi: 10.1080/10556788.2014.966825.

[6]

S. Babaie–Kafaki, R. Ghanbari and N. Mahdavi–Amiri, Two new conjugate gradient methods based on modified secant equations, J. Comput. Appl. Math., 234 (2010), 1374-1386. doi: 10.1016/j.cam.2010.01.052.

[7]

J. Barzilai and J. Borwein, Two–point stepsize gradient methods, IMA J. Numer. Anal., 8 (1988), 141-148.  doi: 10.1093/imanum/8.1.141.

[8]

C. Broyden, The convergence of a class of double–rank minimization algorithms. II. The new algorithm, J. Inst. Math. Appl., 6 (1970), 222-231. doi: 10.1093/imamat/6.3.222.

[9]

Y. Dai and C. Kou, A nonlinear conjugate gradient algorithm with an optimal property and an improved Wolfe line search, SIAM J. Optim., 23 (2013), 296-320.  doi: 10.1137/100813026.

[10]

Y. Dai and L. Liao, New conjugacy conditions and related nonlinear conjugate gradient methods, Appl. Math. Optim., 43 (2001), 87-101.  doi: 10.1007/s002450010019.

[11]

E. Dolan and J. Moré, Benchmarking optimization software with performance profiles, Math. Program., 91 (2002), 201-213.  doi: 10.1007/s101070100263.

[12]

R. Fletcher, A new approach to variable metric algorithms, Comput. J., 13 (1970), 317-322.  doi: 10.1093/comjnl/13.3.317.

[13]

J. Ford and I. Moghrabi, Multi–step quasi–Newton methods for optimization, J. Comput. Appl. Math., 50 (1994), 305-323.  doi: 10.1016/0377-0427(94)90309-3.

[14]

J. Ford, Y. Narushima and H. Yabe, Multi–step nonlinear conjugate gradient methods for unconstrained minimization, Comput. Optim. Appl., 40 (2008), 191-216. doi: 10.1007/s10589-007-9087-z.

[15]

D. Goldfarb, A family of variable metric methods derived by variational means, Math. Comp., 24 (1970), 23-26.  doi: 10.1090/S0025-5718-1970-0258249-6.

[16]

N. Gould, D. Orban and P. Toint, CUTEr: a constrained and unconstrained testing environment, revisited, ACM Trans. Math. Softw., 29 (2003), 353-372. doi: 10.1145/962437.962438.

[17]

W. Hager and H. Zhang, A new conjugate gradient method with guaranteed descent and an efficient line search, SIAM J. Optim., 16 (2005), 170-192.  doi: 10.1137/030601880.

[18]

W. Hager and H. Zhang, Algorithm 851: CG_Descent, a conjugate gradient method with guaranteed descent, ACM Trans. Math. Softw., 32 (2006), 113-137.  doi: 10.1145/1132973.1132979.

[19]

W. Hager and H. Zhang, A survey of nonlinear conjugate gradient methods, Pac. J. Optim., 2 (2006), 35-58. 

[20]

M. Hestenes and E. Stiefel, Methods of conjugate gradients for solving linear systems, J. Research Nat. Bur. Standards, 49 (1952), 409-436.  doi: 10.6028/jres.049.044.

[21]

D. Li and M. Fukushima, A modified BFGS method and its global convergence in nonconvex minimization, J. Comput. Appl. Math., 129 (2001), 15-35.  doi: 10.1016/S0377-0427(00)00540-9.

[22]

G. Li, C. Tang and Z. Wei, New conjugacy condition and related new conjugate gradient methods for unconstrained optimization, J. Comput. Appl. Math., 202 (2007), 523-539. doi: 10.1016/j.cam.2006.03.005.

[23]

D. Liu and J. Nocedal, On the limited memory BFGS method for large–scale optimization, Math. Program., 45 (1989), 503-528.  doi: 10.1007/BF01589116.

[24]

S. Oren, Self–scaling variable metric (SSVM) algorithms. II. Implementation and experiments, Management Sci., 20 (1974), 863-874.

[25]

S. Oren and D. Luenberger, Self–scaling variable metric (SSVM) algorithms. I. Criteria and sufficient conditions for scaling a class of algorithms, Management Sci., 20 (1973/74), 845-862.

[26]

S. Oren and E. Spedicato, Optimal conditioning of self–scaling variable metric algorithms, Math. Program., 10 (1976), 70-90.  doi: 10.1007/BF01580654.

[27]

D. Shanno, Conditioning of quasi–Newton methods for function minimization, Math. Comp., 24 (1970), 647-656.  doi: 10.1090/S0025-5718-1970-0274029-X.

[28]

K. Sugiki, Y. Narushima and H. Yabe, Globally convergent three–term conjugate gradient methods that use secant conditions and generate descent search directions for unconstrained optimization, J. Optim. Theory Appl., 153 (2012), 733-757. doi: 10.1007/s10957-011-9960-x.

[29]

W. Sun and Y. Yuan, Optimization Theory and Methods: Nonlinear Programming, Springer, New York, 2006.

[30]

Z. Wei, G. Li and L. Qi, New quasi–Newton methods for unconstrained optimization problems, Appl. Math. Comput., 175 (2006), 1156-1188. doi: 10.1016/j.amc.2005.08.027.

[31]

H. Yabe and M. Takano, Global convergence properties of nonlinear conjugate gradient methods with modified secant condition, Comput. Optim. Appl., 28 (2004), 203-225.  doi: 10.1023/B:COAP.0000026885.81997.88.

[32]

Y. Yuan, A modified BFGS algorithm for unconstrained optimization, IMA J. Numer. Anal., 11 (1991), 325-332.  doi: 10.1093/imanum/11.3.325.

[33]

J. Zhang, N. Deng and L. Chen, New quasi–Newton equation and related methods for unconstrained optimization, J. Optim. Theory Appl., 102 (1999), 147-167. doi: 10.1023/A:1021898630001.

[34]

L. Zhang, W. Zhou and D. Li, Some descent three–term conjugate gradient methods and their global convergence, Optim. Methods Softw., 22 (2007), 697-711. doi: 10.1080/10556780701223293.

[35]

W. Zhou and L. Zhang, A nonlinear conjugate gradient method based on the MBFGS secant condition, Optim. Methods Softw., 21 (2006), 707-714.  doi: 10.1080/10556780500137041.

Figure 1.  Performance profiles for NMDL1, NMDL2, NMDL3 and MDL based on the total number of function and gradient evaluations
Figure 2.  Performance profiles for NMDL1, NMDL2, NMDL3 and MDL based on CPU time