doi: 10.3934/jimo.2019013

A class of accelerated conjugate-gradient-like methods based on a modified secant equation

Yigui Ou and Haichan Lin
Department of Applied Mathematics, Hainan University, Haikou 570228, China

* Corresponding author: Yigui Ou

Received: May 2018; Revised: September 2018; Published: March 2019

Fund Project: This work is supported by the NNSF of China (Nos. 11261015, 11761025) and the NSF of Hainan Province (Nos. 111001, 2016CXTD004)

This paper proposes a new class of accelerated conjugate-gradient-like algorithms for solving large-scale unconstrained optimization problems, which combines the accelerated adaptive Perry conjugate gradient algorithms proposed by Andrei (2017) with a modified secant condition and a nonmonotone line search technique. An attractive property of the proposed methods is that the search direction always satisfies the sufficient descent condition, independently of the line search used and of the convexity of the objective function. Under common assumptions, the proposed methods are proven to be globally convergent for nonconvex smooth functions and R-linearly convergent for uniformly convex functions. Numerical experiments show the efficiency of the proposed methods in practical computations.
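The nonmonotone line search referred to here follows the Grippo-Lampariello-Lucidi rule [11]: a step is accepted once the trial value falls below the maximum of the last few objective values, rather than the current one, which allows occasional increases and helps on ill-conditioned problems. The sketch below is a generic illustration of that rule inside a conjugate-gradient-like loop, not the paper's algorithm; the Hestenes-Stiefel-type parameter, the restart safeguard, and all names and parameter values are stand-ins chosen for the example.

```python
import numpy as np

def nonmonotone_armijo(f, x, d, g, f_hist, delta=1e-4, rho=0.5, max_backtracks=50):
    """Grippo-Lampariello-Lucidi nonmonotone Armijo rule: accept alpha when
    f(x + alpha*d) <= max(recent f values) + delta*alpha*g^T d.
    f_hist holds the last few objective values (most recent last)."""
    f_ref = max(f_hist)           # nonmonotone reference value
    gtd = float(g @ d)            # directional derivative; must be negative
    alpha = 1.0
    for _ in range(max_backtracks):
        if f(x + alpha * d) <= f_ref + delta * alpha * gtd:
            return alpha
        alpha *= rho              # backtrack
    return alpha

def nonmonotone_cg(f, grad, x0, memory=10, tol=1e-6, max_iter=2000):
    """Illustrative CG-like loop using the nonmonotone line search above;
    NOT the paper's accelerated method, just a minimal sketch."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    f_hist = [f(x)]
    for _ in range(max_iter):
        if np.linalg.norm(g, np.inf) <= tol:
            break
        alpha = nonmonotone_armijo(f, x, d, g, f_hist)
        x = x + alpha * d
        g_new = grad(x)
        y = g_new - g
        # Hestenes-Stiefel-type conjugacy parameter as a stand-in
        denom = float(d @ y)
        beta = float(g_new @ y) / denom if abs(denom) > 1e-12 else 0.0
        d = -g_new + beta * d
        if float(g_new @ d) >= 0.0:   # safeguard: restart if not a descent direction
            d = -g_new
        g = g_new
        f_hist = (f_hist + [f(x)])[-memory:]
    return x
```

The restart safeguard mimics, in a crude way, the sufficient-descent guarantee the paper builds directly into its search direction.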

Citation: Yigui Ou, Haichan Lin. A class of accelerated conjugate-gradient-like methods based on a modified secant equation. Journal of Industrial & Management Optimization, doi: 10.3934/jimo.2019013
References:
[1] N. Andrei, Scaled conjugate gradient algorithms for unconstrained optimization, Computational Optimization and Applications, 38 (2007), 401-416. doi: 10.1007/s10589-007-9055-7.
[2] N. Andrei, Acceleration of conjugate gradient algorithms for unconstrained optimization, Applied Mathematics and Computation, 213 (2009), 361-369. doi: 10.1016/j.amc.2009.03.020.
[3] N. Andrei, Accelerated adaptive Perry conjugate gradient algorithms based on the self-scaling memoryless BFGS update, Journal of Computational and Applied Mathematics, 325 (2017), 149-164. doi: 10.1016/j.cam.2017.04.045.
[4] I. Bongartz, A. R. Conn, N. I. M. Gould and P. L. Toint, CUTE: Constrained and unconstrained testing environments, ACM Transactions on Mathematical Software, 21 (1995), 123-160. doi: 10.1145/200979.201043.
[5] Z. F. Dai, Two modified HS type conjugate gradient methods for unconstrained optimization problems, Nonlinear Analysis: Theory, Methods & Applications, 74 (2011), 927-936. doi: 10.1016/j.na.2010.09.046.
[6] Z. F. Dai, X. H. Chen and F. H. Wen, A modified Perry's conjugate gradient method-based derivative-free method for solving large-scale nonlinear monotone equations, Applied Mathematics and Computation, 270 (2015), 378-386. doi: 10.1016/j.amc.2015.08.014.
[7] Z. F. Dai and F. H. Wen, Global convergence of a modified Hestenes-Stiefel nonlinear conjugate gradient method with Armijo line search, Numerical Algorithms, 59 (2012), 79-93. doi: 10.1007/s11075-011-9477-2.
[8] Y. H. Dai, On the nonmonotone line search, Journal of Optimization Theory and Applications, 112 (2002), 315-330. doi: 10.1023/A:1013653923062.
[9] Y. H. Dai and Y. X. Yuan, Nonlinear Conjugate Gradient Methods (in Chinese), Shanghai Scientific and Technical Publishers, Shanghai, 2000.
[10] E. D. Dolan and J. J. Moré, Benchmarking optimization software with performance profiles, Mathematical Programming, Series A, 91 (2002), 201-213. doi: 10.1007/s101070100263.
[11] L. Grippo, F. Lampariello and S. Lucidi, A nonmonotone line search technique for Newton's method, SIAM Journal on Numerical Analysis, 23 (1986), 707-716. doi: 10.1137/0723046.
[12] N. Z. Gu and J. T. Mo, Incorporating nonmonotone strategies into the trust region method for unconstrained optimization, Computers and Mathematics with Applications, 55 (2008), 2158-2172. doi: 10.1016/j.camwa.2007.08.038.
[13] W. W. Hager and H. C. Zhang, A new conjugate gradient method with guaranteed descent and an efficient line search, SIAM Journal on Optimization, 16 (2005), 170-192. doi: 10.1137/030601880.
[14] W. W. Hager and H. C. Zhang, A survey of nonlinear conjugate gradient methods, Pacific Journal of Optimization, 2 (2006), 35-58.
[15] D. H. Li and M. Fukushima, A modified BFGS method and its global convergence in nonconvex minimization, Journal of Computational and Applied Mathematics, 129 (2001), 15-35. doi: 10.1016/S0377-0427(00)00540-9.
[16] I. E. Livieris and P. Pintelas, Globally convergent modified Perry's conjugate gradient method, Applied Mathematics and Computation, 218 (2012), 9197-9207. doi: 10.1016/j.amc.2012.02.076.
[17] J. Nocedal and S. J. Wright, Numerical Optimization, Springer, New York, 1999. doi: 10.1007/b98874.
[18] Y. G. Ou, A note on the global convergence theorem of accelerated adaptive Perry conjugate gradient methods, Journal of Computational and Applied Mathematics, 332 (2018), 101-106. doi: 10.1016/j.cam.2017.10.024.
[19] D. F. Shanno and K. H. Phua, Algorithm 500: Minimization of unconstrained multivariate functions, ACM Transactions on Mathematical Software, 2 (1976), 87-94. doi: 10.1145/355666.355673.
[20] W. Y. Sun and Y. X. Yuan, Optimization Theory and Methods: Nonlinear Programming, Springer, New York, 2006.
[21] S. W. Yao, D. L. He and L. H. Shi, An improved Perry conjugate gradient method with adaptive parameter choice, Numerical Algorithms, 78 (2018), 1255-1269. doi: 10.1007/s11075-017-0422-x.
[22] G. L. Yuan, Modified nonlinear conjugate gradient methods with sufficient descent property for large-scale optimization problems, Optimization Letters, 3 (2009), 11-21. doi: 10.1007/s11590-008-0086-5.
[23] G. L. Yuan, Z. H. Meng and Y. Li, A modified Hestenes and Stiefel conjugate gradient algorithm for large scale nonsmooth minimizations and nonlinear equations, Journal of Optimization Theory and Applications, 168 (2016), 129-152. doi: 10.1007/s10957-015-0781-1.
[24] J. Z. Zhang, N. Y. Deng and L. H. Chen, New quasi-Newton equation and related methods for unconstrained optimization, Journal of Optimization Theory and Applications, 102 (1999), 147-167. doi: 10.1023/A:1021898630001.
[25] H. C. Zhang and W. W. Hager, A nonmonotone line search technique and its application to unconstrained optimization, SIAM Journal on Optimization, 14 (2004), 1043-1056. doi: 10.1137/S1052623403428208.



Figure 1.  Performance profiles based on the CPU time (left) and the number $ N_f $ of function evaluations (right)
Figure 2.  Performance profiles based on the CPU time (left) and the number $ N_f $ of function evaluations (right)
Figure 3.  Performance profiles based on the CPU time (left) and the number $ N_f $ of function evaluations (right)
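Figures 1-3 are Dolan-Moré performance profiles [10]: for each solver s, the curve rho_s(tau) gives the fraction of test problems that s solves within a factor tau of the best solver's cost. The construction is simple enough to sketch; the function below is a generic illustration (the name, the cost-matrix layout, and the `np.inf` failure convention are my own assumptions, not the authors' plotting code):

```python
import numpy as np

def performance_profile(costs):
    """Dolan-More performance profile. costs[p, s] is the cost (e.g. CPU
    time or N_f) of solver s on problem p; np.inf marks a failure.
    Returns (taus, rho): rho[s, j] is the fraction of problems that
    solver s solves within a factor taus[j] of the best solver."""
    T = np.asarray(costs, dtype=float)
    best = T.min(axis=1, keepdims=True)            # best cost per problem
    ratios = T / best                              # performance ratios r_{p,s}
    taus = np.unique(ratios[np.isfinite(ratios)])  # evaluation points
    # rho_s(tau) = |{p : r_{p,s} <= tau}| / (number of problems)
    rho = np.array([[float(np.mean(ratios[:, s] <= t)) for t in taus]
                    for s in range(T.shape[1])])
    return taus, rho
```

Reading the plots: the solver whose curve is highest at tau = 1 wins most often outright, while the curve's height for large tau measures robustness (fraction of problems solved at all).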
Table 1.  List of the test problems

No. | Description of function    | Dim.  | No. | Description of function | Dim.
 1  | Ex. Freud. Roth            | 10000 | 35  | Ex. Tri. Diag 2         | 10000
 2  | Ex. Trigonometric          | 10000 | 36  | FLETCBV3 (CUTE)         | 1000
 3  | Ex. Rosenbrock             | 10000 | 37  | FLETCHCR (CUTE)         | 5000
 4  | Ex. White and Holst        | 10000 | 38  | BDQRTIC (CUTE)          | 10000
 5  | Ex. Beale                  | 10000 | 39  | TRIDIA (CUTE)           | 10000
 6  | Ex. Penalty                | 10000 | 40  | ARGLINB (CUTE)          | 10000
 7  | Perturbed Quad.            | 10000 | 41  | ARWHEAD (CUTE)          | 10000
 8  | Raydan 1                   | 10000 | 42  | NONDIA (CUTE)           | 10000
 9  | Raydan 2                   | 10000 | 43  | NONDQUAR (CUTE)         | 10000
10  | Diagonal 1                 | 10000 | 44  | DQDRTIC (CUTE)          | 10000
11  | Diagonal 2                 | 10000 | 45  | EG2 (CUTE)              | 10000
12  | Diagonal 3                 | 10000 | 46  | CURLY20 (CUTE)          | 1000
13  | Hager Function             | 10000 | 47  | DIXMAANA (CUTE)         | 9000
14  | Generalized Tri. Diag. 1   | 10000 | 48  | DIXMAANB (CUTE)         | 9000
15  | Ex. Tri. Diag. 1           | 10000 | 49  | DIXMAANC (CUTE)         | 9000
16  | Ex. Three Exp.             | 10000 | 50  | DIXMAAND (CUTE)         | 9000
17  | Generalized Tri. Diag. 2   | 10000 | 51  | DIXMAANE (CUTE)         | 9000
18  | Diagonal 4                 | 10000 | 52  | DIXMAANF (CUTE)         | 9000
19  | Diagonal 5                 | 10000 | 53  | DIXMAANG (CUTE)         | 9000
20  | Ex. Himmelblau             | 10000 | 54  | DIXMAANH (CUTE)         | 9000
21  | Generalized PSC1           | 10000 | 55  | DIXMAANI (CUTE)         | 9000
22  | Ex. PSC1                   | 10000 | 56  | DIXMAANJ (CUTE)         | 9000
23  | Ex. Powell                 | 10000 | 57  | DIXMAANK (CUTE)         | 9000
24  | Ex. Block-Diag. BD1        | 10000 | 58  | DIXMAANL (CUTE)         | 9000
25  | Ex. Maratos                | 10000 | 59  | LIARWHD (CUTE)          | 10000
26  | Ex. Cliff                  | 10000 | 60  | POWER (CUTE)            | 5000
27  | Quad. Dia. Perturbed       | 10000 | 61  | ENGVAL1 (CUTE)          | 10000
28  | Ex. Wood                   | 10000 | 62  | EDENSCH (CUTE)          | 10000
29  | Ex. Hiebert                | 10000 | 63  | VARDIM (CUTE)           | 10000
30  | Quadratic QF1              | 10000 | 64  | QUARTC (CUTE)           | 10000
31  | Ex. Quad. Pena. QP1        | 10000 | 65  | SINQUAD (CUTE)          | 5000
32  | Ex. Quad. Pena. QP2        | 5000  | 66  | DENSCHNB (CUTE)         | 10000
33  | Quadratic QF2              | 10000 | 67  | DENSCHNF (CUTE)         | 10000
34  | Ex. EP1                    | 10000 | 68  | COSINE (CUTE)           | 10000
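The problems in Table 1 come from the standard unconstrained-optimization test collections, including CUTE [4]. As an example of the kind of function involved, here is the usual pairwise form of problem 3 (Extended Rosenbrock); the exact formulation used in the experiments is fixed by the cited test collections, so treat this as a plausible sketch rather than the paper's definition:

```python
import numpy as np

def ext_rosenbrock(x):
    """Extended Rosenbrock in its common pairwise form: for i = 1..n/2,
    100*(x_{2i} - x_{2i-1}^2)^2 + (1 - x_{2i-1})^2, with minimum 0 at
    the all-ones vector. n is assumed even."""
    x = np.asarray(x, dtype=float)
    odd, even = x[0::2], x[1::2]   # x_{2i-1} and x_{2i} in 0-based slices
    return float(np.sum(100.0 * (even - odd**2) ** 2 + (1.0 - odd) ** 2))
```

At dimension 10000, as in the table, the function is a sum of 5000 independent two-variable Rosenbrock terms, which is what makes it a convenient large-scale benchmark.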

