May  2020, 16(3): 1503-1518. doi: 10.3934/jimo.2019013

A class of accelerated conjugate-gradient-like methods based on a modified secant equation

Department of Applied Mathematics, Hainan University, Haikou 570228, China

* Corresponding author: Yigui Ou

Received: May 2018. Revised: September 2018. Published: May 2020. Early access: March 2019.

Fund Project: This work was supported by the NNSF of China (Nos. 11261015, 11761025) and the NSF of Hainan Province (Nos. 111001, 2016CXTD004)

This paper proposes a new class of accelerated conjugate-gradient-like algorithms for solving large-scale unconstrained optimization problems, which combines the accelerated adaptive Perry conjugate gradient algorithms proposed by Andrei (2017) with a modified secant condition and a nonmonotone line search technique. An attractive property of the proposed methods is that the search direction always satisfies the sufficient descent condition, independently of the line search used and of the convexity of the objective function. Under common assumptions, the proposed methods are proven to be globally convergent for nonconvex smooth functions and R-linearly convergent for uniformly convex functions. Numerical experiments demonstrate the efficiency of the proposed methods in practical computations.
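To make the abstract's ingredients concrete, the following minimal Python sketch combines a generic Hestenes-Stiefel-type conjugate gradient direction with a Grippo-Lampariello-Lucidi nonmonotone Armijo line search [11] and a sufficient-descent restart safeguard. This is an illustrative generic scheme under assumed tolerances, not the authors' accelerated algorithm, which additionally uses a modified secant condition and an acceleration step.

```python
import numpy as np

def nonmonotone_cg(f, grad, x0, M=10, delta=1e-4, max_iter=500, tol=1e-6):
    """Generic CG-like method with a nonmonotone Armijo line search.
    Illustrative sketch only; not the accelerated method of the paper."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                         # first direction: steepest descent
    recent_f = [f(x)]              # window of the last M function values
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        # Nonmonotone Armijo: compare against the max of recent values
        f_ref = max(recent_f)
        t = 1.0
        while f(x + t * d) > f_ref + delta * t * g.dot(d) and t > 1e-12:
            t *= 0.5
        x_new = x + t * d
        g_new = grad(x_new)
        y = g_new - g
        # Hestenes-Stiefel beta, truncated at zero
        denom = d.dot(y)
        beta = max(g_new.dot(y) / denom, 0.0) if abs(denom) > 1e-12 else 0.0
        d = -g_new + beta * d
        # Safeguard: restart with steepest descent if descent is lost
        if g_new.dot(d) > -1e-12 * g_new.dot(g_new):
            d = -g_new
        x, g = x_new, g_new
        recent_f.append(f(x))
        if len(recent_f) > M:
            recent_f.pop(0)
    return x
```

The nonmonotone reference value max(recent_f) allows occasional increases in f, which often helps on ill-conditioned problems where a strictly monotone Armijo rule forces tiny steps.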

Citation: Yigui Ou, Haichan Lin. A class of accelerated conjugate-gradient-like methods based on a modified secant equation. Journal of Industrial and Management Optimization, 2020, 16 (3) : 1503-1518. doi: 10.3934/jimo.2019013
References:
[1]

N. Andrei, Scaled conjugate gradient algorithms for unconstrained optimization, Computational Optimization and Applications, 38 (2007), 401-416.  doi: 10.1007/s10589-007-9055-7.

[2]

N. Andrei, Acceleration of conjugate gradient algorithms for unconstrained optimization, Applied Mathematics and Computation, 213 (2009), 361-369.  doi: 10.1016/j.amc.2009.03.020.

[3]

N. Andrei, Accelerated adaptive Perry conjugate gradient algorithms based on the self-scaling memoryless BFGS update, Journal of Computational and Applied Mathematics, 325 (2017), 149-164.  doi: 10.1016/j.cam.2017.04.045.

[4]

I. Bongartz, A. R. Conn, N. I. M. Gould and P. L. Toint, CUTE: constrained and unconstrained testing environments, ACM Transactions on Mathematical Software, 21 (1995), 123-160.  doi: 10.1145/200979.201043.

[5]

Z. F. Dai, Two modified HS type conjugate gradient methods for unconstrained optimization problems, Nonlinear Analysis: Theory, Methods & Applications, 74 (2011), 927-936.  doi: 10.1016/j.na.2010.09.046.

[6]

Z. F. Dai, X. H. Chen and F. H. Wen, A modified Perry's conjugate gradient method-based derivative-free method for solving large-scale nonlinear monotone equations, Applied Mathematics and Computation, 270 (2015), 378-386.  doi: 10.1016/j.amc.2015.08.014.

[7]

Z. F. Dai and F. H. Wen, Global convergence of a modified Hestenes-Stiefel nonlinear conjugate gradient method with Armijo line search, Numerical Algorithms, 59 (2012), 79-93.  doi: 10.1007/s11075-011-9477-2.

[8]

Y. H. Dai, On the nonmonotone line search, Journal of Optimization Theory and Applications, 112 (2002), 315-330.  doi: 10.1023/A:1013653923062.

[9]

Y. H. Dai and Y. X. Yuan, Nonlinear Conjugate Gradient Methods (in Chinese), Shanghai Scientific and Technical Publishers, Shanghai, 2000.

[10]

E. D. Dolan and J. J. Moré, Benchmarking optimization software with performance profiles, Mathematical Programming, Series A, 91 (2002), 201-213.  doi: 10.1007/s101070100263.

[11]

L. Grippo, F. Lampariello and S. Lucidi, A nonmonotone line search technique for Newton's method, SIAM Journal on Numerical Analysis, 23 (1986), 707-716.  doi: 10.1137/0723046.

[12]

N. Z. Gu and J. T. Mo, Incorporating nonmonotone strategies into the trust region method for unconstrained optimization, Computers and Mathematics with Applications, 55 (2008), 2158-2172.  doi: 10.1016/j.camwa.2007.08.038.

[13]

W. W. Hager and H. C. Zhang, A new conjugate gradient method with guaranteed descent and an efficient line search, SIAM Journal on Optimization, 16 (2005), 170-192.  doi: 10.1137/030601880.

[14]

W. W. Hager and H. C. Zhang, A survey of nonlinear conjugate gradient methods, Pacific Journal of Optimization, 2 (2006), 35-58. 

[15]

D. H. Li and M. Fukushima, A modified BFGS method and its global convergence in nonconvex minimization, Journal of Computational and Applied Mathematics, 129 (2001), 15-35.  doi: 10.1016/S0377-0427(00)00540-9.

[16]

I. E. Livieris and P. Pintelas, Globally convergent modified Perry's conjugate gradient method, Applied Mathematics and Computation, 218 (2012), 9197-9207.  doi: 10.1016/j.amc.2012.02.076.

[17]

J. Nocedal and S. J. Wright, Numerical Optimization, Springer, New York, 1999. doi: 10.1007/b98874.

[18]

Y. G. Ou, A note on the global convergence theorem of accelerated adaptive Perry conjugate gradient methods, Journal of Computational and Applied Mathematics, 332 (2018), 101-106.  doi: 10.1016/j.cam.2017.10.024.

[19]

D. F. Shanno and K. H. Phua, Algorithm 500, minimization of unconstrained multivariate functions, ACM Transactions on Mathematical Software, 2 (1976), 87-94.  doi: 10.1145/355666.355673.

[20]

W. Y. Sun and Y. X. Yuan, Optimization Theory and Methods: Nonlinear Programming, Springer, New York, 2006.

[21]

S. W. Yao, D. L. He and L. H. Shi, An improved Perry conjugate gradient method with adaptive parameter choice, Numerical Algorithms, 78 (2018), 1255-1269.  doi: 10.1007/s11075-017-0422-x.

[22]

G. L. Yuan, Modified nonlinear conjugate gradient methods with sufficient descent property for large-scale optimization problems, Optimization Letters, 3 (2009), 11-21.  doi: 10.1007/s11590-008-0086-5.

[23]

G. L. Yuan, Z. H. Meng and Y. Li, A modified Hestenes and Stiefel conjugate gradient algorithm for large scale nonsmooth minimizations and nonlinear equations, Journal of Optimization Theory and Applications, 168 (2016), 129-152.  doi: 10.1007/s10957-015-0781-1.

[24]

J. Z. Zhang, N. Y. Deng and L. H. Chen, New quasi-Newton equation and related methods for unconstrained optimization, Journal of Optimization Theory and Applications, 102 (1999), 147-167.  doi: 10.1023/A:1021898630001.

[25]

H. C. Zhang and W. W. Hager, A nonmonotone line search technique and its application to unconstrained optimization, SIAM Journal on Optimization, 14 (2004), 1043-1056.  doi: 10.1137/S1052623403428208.


Figure 1.  Performance profiles based on the CPU time (left) and the number $ N_f $ of function evaluations (right)
Figure 2.  Performance profiles based on the CPU time (left) and the number $ N_f $ of function evaluations (right)
Figure 3.  Performance profiles based on the CPU time (left) and the number $ N_f $ of function evaluations (right)
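Figures 1-3 use the performance-profile methodology of Dolan and Moré [10]. As a reminder of how such profiles are typically computed, here is a minimal sketch; the array layout and the convention of marking failures with infinity are assumptions for illustration, not taken from the paper.

```python
import numpy as np

def performance_profile(T, taus):
    """Dolan-More performance profiles.
    T: (n_problems, n_solvers) array of costs (e.g. CPU time or N_f),
       with np.inf marking a solver failure on a problem.
    Returns an array P with P[i, s] = rho_s(taus[i]), the fraction of
    problems on which solver s is within a factor taus[i] of the best."""
    n_p, n_s = T.shape
    best = T.min(axis=1, keepdims=True)   # best cost per problem
    ratios = T / best                     # performance ratios r_{p,s}
    return np.array([[np.mean(ratios[:, s] <= tau) for s in range(n_s)]
                     for tau in taus])
```

A solver whose profile curve sits higher on the left is faster on more problems; the right-hand limit of the curve gives the solver's overall success rate.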
Table 1.  List of the test problems
No. | Description of function  | Dim.  | No. | Description of function | Dim.
1   | Ex. Freud. Roth          | 10000 | 35  | Ex. Tri. Diag 2         | 10000
2   | Ex. Trigonometric        | 10000 | 36  | FLETCBV3 (CUTE)         | 1000
3   | Ex. Rosenbrock           | 10000 | 37  | FLETCHCR (CUTE)         | 5000
4   | Ex. White and Holst      | 10000 | 38  | BDQRTIC (CUTE)          | 10000
5   | Ex. Beale                | 10000 | 39  | TRIDIA (CUTE)           | 10000
6   | Ex. Penalty              | 10000 | 40  | ARGLINB (CUTE)          | 10000
7   | Perturbed Quad.          | 10000 | 41  | ARWHEAD (CUTE)          | 10000
8   | Raydan 1                 | 10000 | 42  | NONDIA (CUTE)           | 10000
9   | Raydan 2                 | 10000 | 43  | NONDQUAR (CUTE)         | 10000
10  | Diagonal 1               | 10000 | 44  | DQDRTIC (CUTE)          | 10000
11  | Diagonal 2               | 10000 | 45  | EG2 (CUTE)              | 10000
12  | Diagonal 3               | 10000 | 46  | CURLY20 (CUTE)          | 1000
13  | Hager Function           | 10000 | 47  | DIXMAANA (CUTE)         | 9000
14  | Generalized Tri. Diag. 1 | 10000 | 48  | DIXMAANB (CUTE)         | 9000
15  | Ex. Tri. Diag. 1         | 10000 | 49  | DIXMAANC (CUTE)         | 9000
16  | Ex. Three Exp.           | 10000 | 50  | DIXMAAND (CUTE)         | 9000
17  | Generalized Tri. Diag. 2 | 10000 | 51  | DIXMAANE (CUTE)         | 9000
18  | Diagonal 4               | 10000 | 52  | DIXMAANF (CUTE)         | 9000
19  | Diagonal 5               | 10000 | 53  | DIXMAANG (CUTE)         | 9000
20  | Ex. Himmelblau           | 10000 | 54  | DIXMAANH (CUTE)         | 9000
21  | Generalized PSC1         | 10000 | 55  | DIXMAANI (CUTE)         | 9000
22  | Ex. PSC1                 | 10000 | 56  | DIXMAANJ (CUTE)         | 9000
23  | Ex. Powell               | 10000 | 57  | DIXMAANK (CUTE)         | 9000
24  | Ex. Block-Diag. BD1      | 10000 | 58  | DIXMAANL (CUTE)         | 9000
25  | Ex. Maratos              | 10000 | 59  | LIARWHD (CUTE)          | 10000
26  | Ex. Cliff                | 10000 | 60  | POWER (CUTE)            | 5000
27  | Quad. Dia. Perturbed     | 10000 | 61  | ENGVAL1 (CUTE)          | 10000
28  | Ex. Wood                 | 10000 | 62  | EDENSCH (CUTE)          | 10000
29  | Ex. Hiebert              | 10000 | 63  | VARDIM (CUTE)           | 10000
30  | Quadratic QF1            | 10000 | 64  | QUARTC (CUTE)           | 10000
31  | Ex. Quad. Pena. QP1      | 10000 | 65  | SINQUAD (CUTE)          | 5000
32  | Ex. Quad. Pena. QP2      | 5000  | 66  | DENSCHNB (CUTE)         | 10000
33  | Quadratic QF2            | 10000 | 67  | DENSCHNF (CUTE)         | 10000
34  | Ex. EP1                  | 10000 | 68  | COSINE (CUTE)           | 10000