Journal of Industrial & Management Optimization, May 2020, 16(3): 1503-1518. doi: 10.3934/jimo.2019013

A class of accelerated conjugate-gradient-like methods based on a modified secant equation

Yigui Ou and Haichan Lin
Department of Applied Mathematics, Hainan University, Haikou 570228, China

* Corresponding author: Yigui Ou

Received: May 2018. Revised: September 2018. Published: March 2019.

Fund Project: This work is supported by the NNSF of China (Nos. 11261015, 11761025) and the NSF of Hainan Province (Nos. 111001, 2016CXTD004).

This paper proposes a new class of accelerated conjugate-gradient-like algorithms for solving large-scale unconstrained optimization problems, which combines the idea of the accelerated adaptive Perry conjugate gradient algorithms proposed by Andrei (2017) with a modified secant condition and a nonmonotone line search technique. An attractive property of the proposed methods is that the search direction always satisfies the sufficient descent condition, independently of the line search used and of the convexity of the objective function. Under common assumptions, the proposed methods are proven to be globally convergent for nonconvex smooth functions and R-linearly convergent for uniformly convex functions. Numerical experiments show the efficiency of the proposed methods in practical computations.
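To make the ingredients above concrete, one widely used modified secant condition, due to Zhang, Deng and Chen [24], replaces the standard relation $B_{k+1}s_k = y_k$ (with $s_k = x_{k+1}-x_k$ and $y_k = g_{k+1}-g_k$) by a corrected relation that also exploits function values; the exact variant adopted in this paper may differ in its details:
$$
B_{k+1}s_k = \tilde{y}_k, \qquad \tilde{y}_k = y_k + \frac{\vartheta_k}{s_k^{T}u_k}\,u_k, \qquad \vartheta_k = 6\bigl(f(x_k)-f(x_{k+1})\bigr) + 3\bigl(g_k+g_{k+1}\bigr)^{T}s_k,
$$
where $u_k$ is any vector satisfying $s_k^{T}u_k \neq 0$ (a common choice is $u_k = s_k$).

The overall iteration can be sketched as follows. This is a minimal illustrative sketch in Python, not the paper's algorithm: it assumes a Grippo-Lampariello-Lucidi-type nonmonotone Armijo rule [11] and a one-dimensional acceleration of the step length in the spirit of Andrei [2], while the conjugate gradient parameter is a plain Hestenes-Stiefel placeholder rather than the paper's modified-secant-based choice; the function name, tolerances, and safeguards (accelerated_cg_sketch, rho, tau, memory, the restart test) are all hypothetical.

```python
# Minimal illustrative sketch (not the paper's exact algorithm): a conjugate-
# gradient-like iteration with a nonmonotone Armijo backtracking rule and a
# simple one-dimensional acceleration of the accepted step length.
import numpy as np

def accelerated_cg_sketch(f, grad, x0, max_iter=1000, tol=1e-6,
                          memory=10, rho=1e-4, tau=0.5):
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                              # initial direction: steepest descent
    f_hist = [f(x)]                     # recent f-values for the nonmonotone test
    for _ in range(max_iter):
        if np.linalg.norm(g, np.inf) <= tol:
            break
        # Nonmonotone Armijo backtracking: compare with the maximum of the
        # last `memory` function values instead of f(x_k) alone.
        f_ref = max(f_hist[-memory:])
        alpha = 1.0
        while f(x + alpha * d) > f_ref + rho * alpha * g.dot(d) and alpha > 1e-16:
            alpha *= tau
        # Acceleration: Newton-like rescaling of alpha using a secant estimate
        # of the curvature along d (in the spirit of Andrei's acceleration).
        gz = grad(x + alpha * d)
        curv = (gz - g).dot(d)
        if curv > 1e-12:
            alpha = -alpha * g.dot(d) / curv
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g
        # Placeholder CG parameter (Hestenes-Stiefel); the paper instead builds
        # the direction from a modified secant equation so that sufficient
        # descent holds automatically.
        beta = g_new.dot(y) / max(d.dot(y), 1e-12)
        d = -g_new + beta * d
        if g_new.dot(d) >= -1e-10 * g_new.dot(g_new):
            d = -g_new                  # restart if descent is lost
        x, g = x_new, g_new
        f_hist.append(f(x))
    return x
```

For example, `accelerated_cg_sketch(lambda x: x.dot(x), lambda x: 2.0 * x, np.ones(5))` drives the iterates of this simple quadratic toward the origin.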

Citation: Yigui Ou, Haichan Lin. A class of accelerated conjugate-gradient-like methods based on a modified secant equation. Journal of Industrial & Management Optimization, 2020, 16 (3) : 1503-1518. doi: 10.3934/jimo.2019013
References:
[1] N. Andrei, Scaled conjugate gradient algorithms for unconstrained optimization, Computational Optimization and Applications, 38 (2007), 401-416. doi: 10.1007/s10589-007-9055-7.
[2] N. Andrei, Acceleration of conjugate gradient algorithms for unconstrained optimization, Applied Mathematics and Computation, 213 (2009), 361-369. doi: 10.1016/j.amc.2009.03.020.
[3] N. Andrei, Accelerated adaptive Perry conjugate gradient algorithms based on the self-scaling memoryless BFGS update, Journal of Computational and Applied Mathematics, 325 (2017), 149-164. doi: 10.1016/j.cam.2017.04.045.
[4] I. Bongartz, A. R. Conn, N. I. M. Gould and P. L. Toint, CUTE: Constrained and unconstrained testing environments, ACM Transactions on Mathematical Software, 21 (1995), 123-160. doi: 10.1145/200979.201043.
[5] Z. F. Dai, Two modified HS type conjugate gradient methods for unconstrained optimization problems, Nonlinear Analysis: Theory, Methods & Applications, 74 (2011), 927-936. doi: 10.1016/j.na.2010.09.046.
[6] Z. F. Dai, X. H. Chen and F. H. Wen, A modified Perry's conjugate gradient method-based derivative-free method for solving large-scale nonlinear monotone equations, Applied Mathematics and Computation, 270 (2015), 378-386. doi: 10.1016/j.amc.2015.08.014.
[7] Z. F. Dai and F. H. Wen, Global convergence of a modified Hestenes-Stiefel nonlinear conjugate gradient method with Armijo line search, Numerical Algorithms, 59 (2012), 79-93. doi: 10.1007/s11075-011-9477-2.
[8] Y. H. Dai, On the nonmonotone line search, Journal of Optimization Theory and Applications, 112 (2002), 315-330. doi: 10.1023/A:1013653923062.
[9] Y. H. Dai and Y. X. Yuan, Nonlinear Conjugate Gradient Methods (in Chinese), Shanghai Scientific and Technical Publishers, Shanghai, 2000.
[10] E. D. Dolan and J. J. Moré, Benchmarking optimization software with performance profiles, Mathematical Programming, Series A, 91 (2002), 201-213. doi: 10.1007/s101070100263.
[11] L. Grippo, F. Lampariello and S. Lucidi, A nonmonotone line search technique for Newton's method, SIAM Journal on Numerical Analysis, 23 (1986), 707-716. doi: 10.1137/0723046.
[12] N. Z. Gu and J. T. Mo, Incorporating nonmonotone strategies into the trust region method for unconstrained optimization, Computers and Mathematics with Applications, 55 (2008), 2158-2172. doi: 10.1016/j.camwa.2007.08.038.
[13] W. W. Hager and H. C. Zhang, A new conjugate gradient method with guaranteed descent and an efficient line search, SIAM Journal on Optimization, 16 (2005), 170-192. doi: 10.1137/030601880.
[14] W. W. Hager and H. C. Zhang, A survey of nonlinear conjugate gradient methods, Pacific Journal of Optimization, 2 (2006), 35-58.
[15] D. H. Li and M. Fukushima, A modified BFGS method and its global convergence in nonconvex minimization, Journal of Computational and Applied Mathematics, 129 (2001), 15-35. doi: 10.1016/S0377-0427(00)00540-9.
[16] I. E. Livieris and P. Pintelas, Globally convergent modified Perry's conjugate gradient method, Applied Mathematics and Computation, 218 (2012), 9197-9207. doi: 10.1016/j.amc.2012.02.076.
[17] J. Nocedal and S. J. Wright, Numerical Optimization, Springer, New York, 1999. doi: 10.1007/b98874.
[18] Y. G. Ou, A note on the global convergence theorem of accelerated adaptive Perry conjugate gradient methods, Journal of Computational and Applied Mathematics, 332 (2018), 101-106. doi: 10.1016/j.cam.2017.10.024.
[19] D. F. Shanno and K. H. Phua, Algorithm 500: Minimization of unconstrained multivariate functions, ACM Transactions on Mathematical Software, 2 (1976), 87-94. doi: 10.1145/355666.355673.
[20] W. Y. Sun and Y. X. Yuan, Optimization Theory and Methods: Nonlinear Programming, Springer, New York, 2006.
[21] S. W. Yao, D. L. He and L. H. Shi, An improved Perry conjugate gradient method with adaptive parameter choice, Numerical Algorithms, 78 (2018), 1255-1269. doi: 10.1007/s11075-017-0422-x.
[22] G. L. Yuan, Modified nonlinear conjugate gradient methods with sufficient descent property for large-scale optimization problems, Optimization Letters, 3 (2009), 11-21. doi: 10.1007/s11590-008-0086-5.
[23] G. L. Yuan, Z. H. Meng and Y. Li, A modified Hestenes and Stiefel conjugate gradient algorithm for large scale nonsmooth minimizations and nonlinear equations, Journal of Optimization Theory and Applications, 168 (2016), 129-152. doi: 10.1007/s10957-015-0781-1.
[24] J. Z. Zhang, N. Y. Deng and L. H. Chen, New quasi-Newton equation and related methods for unconstrained optimization, Journal of Optimization Theory and Applications, 102 (1999), 147-167. doi: 10.1023/A:1021898630001.
[25] H. C. Zhang and W. W. Hager, A nonmonotone line search technique and its application to unconstrained optimization, SIAM Journal on Optimization, 14 (2004), 1043-1056. doi: 10.1137/S1052623403428208.


Figure 1.  Performance profiles based on the CPU time (left) and the number $ N_f $ of function evaluations (right)
Figure 2.  Performance profiles based on the CPU time (left) and the number $ N_f $ of function evaluations (right)
Figure 3.  Performance profiles based on the CPU time (left) and the number $ N_f $ of function evaluations (right)
Table 1.  List of the test problems

No. | Description of function | Dim. | No. | Description of function | Dim.
1 | Ex. Freud. Roth | 10000 | 35 | Ex. Tri. Diag 2 | 10000
2 | Ex. Trigonometric | 10000 | 36 | FLETCBV3(CUTE) | 1000
3 | Ex. Rosenbrock | 10000 | 37 | FLETCHCR(CUTE) | 5000
4 | Ex. White and Holst | 10000 | 38 | BDQRTIC(CUTE) | 10000
5 | Ex. Beale | 10000 | 39 | TRIDIA(CUTE) | 10000
6 | Ex. Penalty | 10000 | 40 | ARGLINB(CUTE) | 10000
7 | Perturbed Quad. | 10000 | 41 | ARWHEAD(CUTE) | 10000
8 | Raydan 1 | 10000 | 42 | NONDIA(CUTE) | 10000
9 | Raydan 2 | 10000 | 43 | NONDQUAR(CUTE) | 10000
10 | Diagonal 1 | 10000 | 44 | DQDRTIC(CUTE) | 10000
11 | Diagonal 2 | 10000 | 45 | EG2(CUTE) | 10000
12 | Diagonal 3 | 10000 | 46 | CURLY20(CUTE) | 1000
13 | Hager Function | 10000 | 47 | DIXMAANA(CUTE) | 9000
14 | Generalized Tri. Diag. 1 | 10000 | 48 | DIXMAANB(CUTE) | 9000
15 | Ex. Tri. Diag. 1 | 10000 | 49 | DIXMAANC(CUTE) | 9000
16 | Ex. Three Exp. | 10000 | 50 | DIXMAAND(CUTE) | 9000
17 | Generalized Tri. Diag. 2 | 10000 | 51 | DIXMAANE(CUTE) | 9000
18 | Diagonal 4 | 10000 | 52 | DIXMAANF(CUTE) | 9000
19 | Diagonal 5 | 10000 | 53 | DIXMAANG(CUTE) | 9000
20 | Ex. Himmelblau | 10000 | 54 | DIXMAANH(CUTE) | 9000
21 | Generalized PSC1 | 10000 | 55 | DIXMAANI(CUTE) | 9000
22 | Ex. PSC1 | 10000 | 56 | DIXMAANJ(CUTE) | 9000
23 | Ex. Powell | 10000 | 57 | DIXMAANK(CUTE) | 9000
24 | Ex. Block-Diag. BD1 | 10000 | 58 | DIXMAANL(CUTE) | 9000
25 | Ex. Maratos | 10000 | 59 | LIARWHD(CUTE) | 10000
26 | Ex. Cliff | 10000 | 60 | POWER(CUTE) | 5000
27 | Quad. Dia. Perturbed | 10000 | 61 | ENGVAL1(CUTE) | 10000
28 | Ex. Wood | 10000 | 62 | EDENSCH(CUTE) | 10000
29 | Ex. Hiebert | 10000 | 63 | VARDIM(CUTE) | 10000
30 | Quadratic QF1 | 10000 | 64 | QUARTC(CUTE) | 10000
31 | Ex. Quad. Pena. QP1 | 10000 | 65 | SINQUAD(CUTE) | 5000
32 | Ex. Quad. Pena. QP2 | 5000 | 66 | DENSCHNB(CUTE) | 10000
33 | Quadratic QF2 | 10000 | 67 | DENSCHNF(CUTE) | 10000
34 | Ex. EP1 | 10000 | 68 | COSINE(CUTE) | 10000