
A class of descent four–term extension of the Dai–Liao conjugate gradient method based on the scaled memoryless BFGS update
1. Department of Mathematics, Faculty of Mathematics, Statistics and Computer Science, Semnan University, P.O. Box 35195-363, Semnan, Iran
2. Faculty of Mathematical Sciences, Ferdowsi University of Mashhad, P.O. Box 9177948953, Mashhad, Iran
Hybridizing the three–term conjugate gradient method proposed by Zhang et al. [34] and the nonlinear conjugate gradient method proposed by Dai and Liao [10] based on the scaled memoryless BFGS update, a one–parameter class of four–term conjugate gradient methods is proposed. The suggested class is shown to possess the sufficient descent property without any convexity assumption on the objective function. A brief global convergence analysis is given for uniformly convex objective functions. Results of numerical comparisons are reported; they demonstrate the efficiency of a method from the proposed class in the sense of the Dolan–Moré performance profile [11].
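To make the ingredients concrete, the sketch below implements the two building blocks being hybridized: the Dai–Liao parameter β_k = g_{k+1}^T(y_k − t s_k)/(d_k^T y_k) from [10] and a Zhang et al. style correction term along y_k from [34]. The paper's actual four–term direction and its scaled memoryless BFGS based coefficients are given in the article itself and are not reproduced here; the Armijo backtracking line search, the way the two pieces are paired, and all function names are illustrative assumptions of this sketch (the convergence analysis of such methods is usually stated under Wolfe conditions).

```python
import numpy as np

def hybrid_cg_sketch(f, grad, x0, t=0.1, tol=1e-6, max_iter=500):
    """Illustrative sketch of the ingredients being hybridized: the
    Dai-Liao parameter beta [10] and a Zhang et al. style third term
    along y_k [34]. The paper's actual four-term direction and its
    scaled memoryless BFGS coefficients are NOT reproduced here."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        gnorm2 = g.dot(g)
        if np.sqrt(gnorm2) < tol:
            break
        # Backtracking Armijo step: a stand-in for the (strong) Wolfe
        # line search usually assumed in the convergence analysis.
        alpha = 1.0
        for _ in range(60):
            if f(x + alpha * d) <= f(x) + 1e-4 * alpha * g.dot(d):
                break
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        # Dai-Liao parameter: beta = g_{k+1}^T (y_k - t*s_k) / (d_k^T y_k).
        dy = d.dot(y)
        beta = g_new.dot(y - t * s) / dy if abs(dy) > 1e-12 else 0.0
        # Zhang et al. style coefficient for the extra term along y_k;
        # pairing it with the Dai-Liao beta here is purely illustrative.
        theta = g_new.dot(d) / gnorm2
        d = -g_new + beta * d - theta * y
        x, g = x_new, g_new
    return x

# Usage on an ill-conditioned convex quadratic (illustrative):
A = np.diag([1.0, 10.0, 100.0])
x_min = hybrid_cg_sketch(lambda x: 0.5 * x.dot(A.dot(x)),
                         lambda x: A.dot(x), np.ones(3))
print(x_min)  # should be close to the origin
```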
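Since the numerical comparisons are evaluated with the Dolan–Moré performance profile [11], a minimal sketch of that metric may help: for each problem p and solver s with cost t_{p,s}, the ratio r_{p,s} = t_{p,s} / min_s t_{p,s} is formed, and ρ_s(τ) is the fraction of problems solver s solves within a factor τ of the best solver. The array shapes and the failure convention (np.inf) below are assumptions of this sketch.

```python
import numpy as np

def performance_profile(T, taus):
    """Dolan-More performance profile [11]. T is an
    (n_problems, n_solvers) array of costs (e.g. CPU times), with
    np.inf marking a failure; each problem is assumed to be solved
    by at least one solver. Returns rho with rho[i, s] = fraction of
    problems that solver s solves within a factor taus[i] of the
    best solver on that problem."""
    best = T.min(axis=1, keepdims=True)   # best cost per problem
    ratios = T / best                     # performance ratios r_{p,s}
    return np.array([(ratios <= tau).mean(axis=0) for tau in taus])

# Example: 3 problems, 2 solvers; solver 2 fails on problem 2.
T = np.array([[1.0, 2.0],
              [3.0, np.inf],
              [2.0, 2.0]])
print(performance_profile(T, taus=[1.0, 2.0, 4.0]))
# approx. [[1.0, 0.333], [1.0, 0.667], [1.0, 0.667]]
```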
References:
[1] N. Andrei, Accelerated scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization, European J. Oper. Res., 204 (2010), 410-420. doi: 10.1016/j.ejor.2009.11.030.
[2] S. Babaie–Kafaki, On the sufficient descent condition of the Hager–Zhang conjugate gradient methods, 4OR, 12 (2014), 285-292. doi: 10.1007/s10288-014-0255-6.
[3] S. Babaie–Kafaki and R. Ghanbari, A descent family of Dai–Liao conjugate gradient methods, Optim. Methods Softw., 29 (2014), 583-591. doi: 10.1080/10556788.2013.833199.
[4] S. Babaie–Kafaki and R. Ghanbari, Two modified three–term conjugate gradient methods with sufficient descent property, Optim. Lett., 8 (2014), 2285-2297. doi: 10.1007/s11590-014-0736-8.
[5] S. Babaie–Kafaki and R. Ghanbari, A hybridization of the Hestenes–Stiefel and Dai–Yuan conjugate gradient methods based on a least–squares approach, Optim. Methods Softw., 30 (2015), 673-681. doi: 10.1080/10556788.2014.966825.
[6] S. Babaie–Kafaki, R. Ghanbari and N. Mahdavi–Amiri, Two new conjugate gradient methods based on modified secant equations, J. Comput. Appl. Math., 234 (2010), 1374-1386. doi: 10.1016/j.cam.2010.01.052.
[7] J. Barzilai and J. Borwein, Two–point stepsize gradient methods, IMA J. Numer. Anal., 8 (1988), 141-148. doi: 10.1093/imanum/8.1.141.
[8] C. Broyden, The convergence of a class of double–rank minimization algorithms. II. The new algorithm, J. Inst. Math. Appl., 6 (1970), 222-231. doi: 10.1093/imamat/6.3.222.
[9] Y. Dai and C. Kou, A nonlinear conjugate gradient algorithm with an optimal property and an improved Wolfe line search, SIAM J. Optim., 23 (2013), 296-320. doi: 10.1137/100813026.
[10] Y. Dai and L. Liao, New conjugacy conditions and related nonlinear conjugate gradient methods, Appl. Math. Optim., 43 (2001), 87-101. doi: 10.1007/s002450010019.
[11] E. Dolan and J. Moré, Benchmarking optimization software with performance profiles, Math. Program., 91 (2002), 201-213. doi: 10.1007/s101070100263.
[12] R. Fletcher, A new approach to variable metric algorithms, Comput. J., 13 (1970), 317-322. doi: 10.1093/comjnl/13.3.317.
[13] J. Ford and I. Moghrabi, Multi–step quasi–Newton methods for optimization, J. Comput. Appl. Math., 50 (1994), 305-323. doi: 10.1016/0377-0427(94)90309-3.
[14] J. Ford, Y. Narushima and H. Yabe, Multi–step nonlinear conjugate gradient methods for unconstrained minimization, Comput. Optim. Appl., 40 (2008), 191-216. doi: 10.1007/s10589-007-9087-z.
[15] D. Goldfarb, A family of variable metric methods derived by variational means, Math. Comp., 24 (1970), 23-26. doi: 10.1090/S0025-5718-1970-0258249-6.
[16] N. Gould, D. Orban and P. Toint, CUTEr: a constrained and unconstrained testing environment, revisited, ACM Trans. Math. Softw., 29 (2003), 353-372. doi: 10.1145/962437.962438.
[17] W. Hager and H. Zhang, A new conjugate gradient method with guaranteed descent and an efficient line search, SIAM J. Optim., 16 (2005), 170-192. doi: 10.1137/030601880.
[18] W. Hager and H. Zhang, Algorithm 851: CG_DESCENT, a conjugate gradient method with guaranteed descent, ACM Trans. Math. Softw., 32 (2006), 113-137. doi: 10.1145/1132973.1132979.
[19] W. Hager and H. Zhang, A survey of nonlinear conjugate gradient methods, Pac. J. Optim., 2 (2006), 35-58.
[20] M. Hestenes and E. Stiefel, Methods of conjugate gradients for solving linear systems, J. Research Nat. Bur. Standards, 49 (1952), 409-436. doi: 10.6028/jres.049.044.
[21] D. Li and M. Fukushima, A modified BFGS method and its global convergence in nonconvex minimization, J. Comput. Appl. Math., 129 (2001), 15-35. doi: 10.1016/S0377-0427(00)00540-9.
[22] G. Li, C. Tang and Z. Wei, New conjugacy condition and related new conjugate gradient methods for unconstrained optimization, J. Comput. Appl. Math., 202 (2007), 523-539. doi: 10.1016/j.cam.2006.03.005.
[23] D. Liu and J. Nocedal, On the limited memory BFGS method for large–scale optimization, Math. Program., 45 (1989), 503-528. doi: 10.1007/BF01589116.
[24] S. Oren, Self–scaling variable metric (SSVM) algorithms. II. Implementation and experiments, Management Sci., 20 (1974), 863-874.
[25] S. Oren and D. Luenberger, Self–scaling variable metric (SSVM) algorithms. I. Criteria and sufficient conditions for scaling a class of algorithms, Management Sci., 20 (1973/74), 845-862.
[26] S. Oren and E. Spedicato, Optimal conditioning of self–scaling variable metric algorithms, Math. Program., 10 (1976), 70-90. doi: 10.1007/BF01580654.
[27] D. Shanno, Conditioning of quasi–Newton methods for function minimization, Math. Comp., 24 (1970), 647-656. doi: 10.1090/S0025-5718-1970-0274029-X.
[28] K. Sugiki, Y. Narushima and H. Yabe, Globally convergent three–term conjugate gradient methods that use secant conditions and generate descent search directions for unconstrained optimization, J. Optim. Theory Appl., 153 (2012), 733-757. doi: 10.1007/s10957-011-9960-x.
[29] W. Sun and Y. Yuan, Optimization Theory and Methods: Nonlinear Programming, Springer, New York, 2006.
[30] Z. Wei, G. Li and L. Qi, New quasi–Newton methods for unconstrained optimization problems, Appl. Math. Comput., 175 (2006), 1156-1188. doi: 10.1016/j.amc.2005.08.027.
[31] H. Yabe and M. Takano, Global convergence properties of nonlinear conjugate gradient methods with modified secant condition, Comput. Optim. Appl., 28 (2004), 203-225. doi: 10.1023/B:COAP.0000026885.81997.88.
[32] Y. Yuan, A modified BFGS algorithm for unconstrained optimization, IMA J. Numer. Anal., 11 (1991), 325-332. doi: 10.1093/imanum/11.3.325.
[33] J. Zhang, N. Deng and L. Chen, New quasi–Newton equation and related methods for unconstrained optimization, J. Optim. Theory Appl., 102 (1999), 147-167. doi: 10.1023/A:1021898630001.
[34] L. Zhang, W. Zhou and D. Li, Some descent three–term conjugate gradient methods and their global convergence, Optim. Methods Softw., 22 (2007), 697-711. doi: 10.1080/10556780701223293.
[35] W. Zhou and L. Zhang, A nonlinear conjugate gradient method based on the MBFGS secant condition, Optim. Methods Softw., 21 (2006), 707-714. doi: 10.1080/10556780500137041.