April 2017, 13(2): 857-872. doi: 10.3934/jimo.2016050

A memory gradient method based on the nonmonotone technique

Department of Applied Mathematics, Hainan University, Haikou 570228, China

* Corresponding author: Yigui Ou

Received: August 2014. Revised: March 2016. Published: August 2016.

Fund Project: This work is supported by the NNSF of China (No. 11261015) and the NSF of Hainan Province (Nos. 2016CXTD004 and 111001).

In this paper, we present a new nonmonotone memory gradient algorithm for unconstrained optimization problems. An attractive property of the proposed method is that its search direction always satisfies the sufficient descent condition at each iteration, independently of the line search used. Under mild assumptions, global and local convergence results for the proposed algorithm are established. Numerical results are also reported, showing that the proposed method is suitable for solving large-scale optimization problems and is more stable in practical computation than other similar methods.
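The abstract highlights two ingredients: a memory gradient direction d_k = -g_k + beta_k d_{k-1} whose descent property holds independently of the line search, and a nonmonotone acceptance rule. As a rough illustration only (not the paper's Algorithm 2.1), the sketch below combines a capped beta_k in the style of Narushima and Yabe [18], which forces g_k^T d_k <= -(1 - rho)||g_k||^2 by construction, with the max-type nonmonotone Armijo search of Grippo, Lampariello and Lucidi [11]; all function names and parameter values here are our own choices.

```python
import numpy as np

def nm_memory_gradient(f, grad, x0, memory=10, delta=1e-4, sigma=0.5,
                       rho=0.5, tol=1e-6, max_iter=5000):
    """Illustrative nonmonotone memory gradient iteration (NOT the paper's
    Algorithm 2.1): d_k = -g_k + beta_k * d_{k-1}, with |beta_k| capped so
    that g_k^T d_k <= -(1 - rho) * ||g_k||^2 at every iteration, plus the
    max-type nonmonotone Armijo rule of Grippo et al. [11]."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                              # first direction: steepest descent
    f_hist = [f(x)]                     # last `memory` function values
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        # Nonmonotone Armijo backtracking against the max of recent values.
        f_ref = max(f_hist)
        gtd = float(g @ d)              # < 0 by the sufficient descent bound
        alpha = 1.0
        while f(x + alpha * d) > f_ref + delta * alpha * gtd and alpha > 1e-16:
            alpha *= sigma
        x = x + alpha * d
        f_hist.append(f(x))
        if len(f_hist) > memory:
            f_hist.pop(0)
        g = grad(x)
        # Memory term with |beta| <= rho * ||g||^2 / |g^T d_old|, which
        # guarantees g^T d_new <= -(1 - rho) * ||g||^2 before any line search.
        beta = rho * float(g @ g) / max(abs(float(g @ d)), 1e-16)
        d = -g + beta * d
    return x

# Example: minimize the Rosenbrock function from the standard start point.
rosen = lambda x: 100.0 * (x[1] - x[0]**2)**2 + (1.0 - x[0])**2
rosen_grad = lambda x: np.array(
    [-400.0 * x[0] * (x[1] - x[0]**2) - 2.0 * (1.0 - x[0]),
     200.0 * (x[1] - x[0]**2)])
print(nm_memory_gradient(rosen, rosen_grad, [-1.2, 1.0]))  # approx. [1, 1]
```

The point of the capped beta_k is that sufficient descent is enforced by construction, before any line search is run, which is the property the abstract emphasizes.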

Citation: Yigui Ou, Yuanwen Liu. A memory gradient method based on the nonmonotone technique. Journal of Industrial & Management Optimization, 2017, 13(2): 857-872. doi: 10.3934/jimo.2016050
References:
[1] N. Andrei, An unconstrained optimization test functions collection, Advanced Modelling and Optimization, 10 (2008), 147-161.
[2] J. Barzilai and J. M. Borwein, Two-point step size gradient methods, IMA Journal of Numerical Analysis, 8 (1988), 141-148. doi: 10.1093/imanum/8.1.141.
[3] W. Y. Cheng and Q. F. Liu, Sufficient descent nonlinear conjugate gradient methods with conjugacy condition, Numerical Algorithms, 53 (2010), 113-131. doi: 10.1007/s11075-009-9318-8.
[4] Y. H. Dai, On the nonmonotone line search, Journal of Optimization Theory and Applications, 112 (2002), 315-330. doi: 10.1023/A:1013653923062.
[5] Y. H. Dai and Y. X. Yuan, A nonlinear conjugate gradient method with a strong global convergence property, SIAM Journal on Optimization, 10 (1999), 177-182. doi: 10.1137/S1052623497318992.
[6] Y. H. Dai and Y. X. Yuan, Nonlinear Conjugate Gradient Methods (in Chinese), Shanghai Scientific and Technical Publishers, Shanghai, 2000.
[7] N. Y. Deng, Y. Xiao and F. J. Zhou, Nonmonotone trust region algorithm, Journal of Optimization Theory and Applications, 76 (1993), 259-285. doi: 10.1007/BF00939608.
[8] E. D. Dolan and J. J. Moré, Benchmarking optimization software with performance profiles, Mathematical Programming, Series A, 91 (2002), 201-213. doi: 10.1007/s101070100263.
[9] J. C. Gilbert and J. Nocedal, Global convergence properties of conjugate gradient methods for optimization, SIAM Journal on Optimization, 2 (1992), 21-42. doi: 10.1137/0802003.
[10] N. I. M. Gould, D. Orban and Ph. L. Toint, CUTEr: A constrained and unconstrained testing environment, revisited, ACM Transactions on Mathematical Software, 29 (2003), 353-372. doi: 10.1145/962437.962438.
[11] L. Grippo, F. Lampariello and S. Lucidi, A nonmonotone line search technique for Newton's method, SIAM Journal on Numerical Analysis, 23 (1986), 707-716. doi: 10.1137/0723046.
[12] N. Z. Gu and J. T. Mo, Incorporating nonmonotone strategies into the trust region method for unconstrained optimization, Computers and Mathematics with Applications, 55 (2008), 2158-2172. doi: 10.1016/j.camwa.2007.08.038.
[13] W. W. Hager and H. C. Zhang, A new conjugate gradient method with guaranteed descent and an efficient line search, SIAM Journal on Optimization, 16 (2005), 170-192. doi: 10.1137/030601880.
[14] W. W. Hager and H. C. Zhang, A survey of nonlinear conjugate gradient methods, Pacific Journal of Optimization, 2 (2006), 35-58.
[15] H. Iiduka and Y. Narushima, Conjugate gradient methods using value of objective function for unconstrained optimization, Optimization Letters, 6 (2012), 941-955. doi: 10.1007/s11590-011-0324-0.
[16] Y. Ji, Y. J. Li, K. C. Zhang and X. L. Zhang, A new nonmonotone trust region method for conic model for solving unconstrained optimization, Journal of Computational and Applied Mathematics, 233 (2010), 1746-1754. doi: 10.1016/j.cam.2009.09.011.
[17] J. T. Mo, K. C. Zhang and Z. X. Wei, A nonmonotone trust region method for unconstrained optimization, Applied Mathematics and Computation, 171 (2005), 371-384. doi: 10.1016/j.amc.2005.01.048.
[18] Y. Narushima and H. Yabe, Global convergence of a memory gradient method for unconstrained optimization, Computational Optimization and Applications, 35 (2006), 325-346. doi: 10.1007/s10589-006-8719-z.
[19] Y. Narushima, H. Yabe and J. A. Ford, A three-term conjugate gradient method with sufficient descent property for unconstrained optimization, SIAM Journal on Optimization, 21 (2011), 212-230. doi: 10.1137/080743573.
[20] J. Nocedal and S. J. Wright, Numerical Optimization, Springer, New York, 1999. doi: 10.1007/b98874.
[21] Y. G. Ou and G. S. Wang, A new supermemory gradient method for unconstrained optimization problems, Optimization Letters, 6 (2012), 975-992. doi: 10.1007/s11590-011-0328-9.
[22] Y. G. Ou and Q. Zhou, A nonmonotonic trust region algorithm for a class of semi-infinite minimax programming, Applied Mathematics and Computation, 215 (2009), 474-480. doi: 10.1016/j.amc.2009.05.009.
[23] M. Raydan, The Barzilai and Borwein gradient method for the large scale unconstrained minimization problem, SIAM Journal on Optimization, 7 (1997), 26-33. doi: 10.1137/S1052623494266365.
[24] Z. J. Shi and J. Shen, A new supermemory gradient method with curve search rule, Applied Mathematics and Computation, 170 (2005), 1-16. doi: 10.1016/j.amc.2004.10.063.
[25] Z. J. Shi and J. Shen, On memory gradient method with trust region for unconstrained optimization, Numerical Algorithms, 41 (2006), 173-196. doi: 10.1007/s11075-005-9008-0.
[26] Z. J. Shi, S. Q. Wang and Z. W. Xu, The convergence of conjugate gradient method with nonmonotone line search, Applied Mathematics and Computation, 217 (2010), 1921-1932. doi: 10.1016/j.amc.2010.06.047.
[27] M. Sun and Q. G. Bai, A new descent memory gradient method and its global convergence, Journal of Systems Science and Complexity, 24 (2011), 784-794. doi: 10.1007/s11424-011-8150-0.
[28] W. Y. Sun, Nonmonotone trust region method for solving optimization problems, Applied Mathematics and Computation, 156 (2004), 159-174. doi: 10.1016/j.amc.2003.07.008.
[29] W. Y. Sun and Y. X. Yuan, Optimization Theory and Methods: Nonlinear Programming, Springer, New York, 2006.
[30] W. Y. Sun and Q. Y. Zhou, An unconstrained optimization method using nonmonotone second order Goldstein's line search, Science in China (Series A): Mathematics, 50 (2007), 1389-1400. doi: 10.1007/s11425-007-0072-x.
[31] Ph. L. Toint, An assessment of nonmonotone line search techniques for unconstrained optimization, SIAM Journal on Scientific Computing, 17 (1996), 725-739. doi: 10.1137/S106482759427021X.
[32] Z. S. Yu, W. G. Zhang and B. F. Wu, Strong global convergence of an adaptive nonmonotone memory gradient method, Applied Mathematics and Computation, 185 (2007), 681-688. doi: 10.1016/j.amc.2006.07.075.
[33] H. C. Zhang and W. W. Hager, A nonmonotone line search technique and its application to unconstrained optimization, SIAM Journal on Optimization, 14 (2004), 1043-1056. doi: 10.1137/S1052623403428208.
[34] L. Zhang, W. J. Zhou and D. H. Li, A descent modified Polak-Ribière-Polyak conjugate gradient method and its global convergence, IMA Journal of Numerical Analysis, 26 (2006), 629-640. doi: 10.1093/imanum/drl016.
[35] Y. Zheng and Z. P. Wan, A new variant of the memory gradient method for unconstrained optimization, Optimization Letters, 6 (2012), 1643-1655. doi: 10.1007/s11590-011-0355-6.
[36] W. J. Zhou and L. Zhang, Global convergence of the nonmonotone MBFGS method for nonconvex unconstrained minimization, Journal of Computational and Applied Mathematics, 223 (2009), 40-47. doi: 10.1016/j.cam.2007.12.011.


Figure 1.  Performance profile based on CPU time
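Figure 1 is a Dolan-Moré performance profile [8] computed from the CPU times in Table 1 below. For readers unfamiliar with the construction, here is a minimal sketch (the function name and the treatment of failures as infinite times are our own conventions): for each solver s, rho_s(tau) is the fraction of problems it solves within tau times the fastest time recorded on that problem.

```python
import numpy as np

def performance_profile(times, n_tau=200):
    """Dolan-More performance profile [8].

    times: (n_problems, n_solvers) array of CPU times, with np.inf for
    failed runs (the "F" entries in Table 1). Assumes every problem is
    solved by at least one solver. Returns a tau grid and rho[s, j] =
    fraction of problems solver s solves within tau[j] times the best
    recorded time for that problem."""
    times = np.asarray(times, dtype=float)
    best = times.min(axis=1, keepdims=True)   # best time per problem
    ratios = times / best                     # performance ratios r_{p,s}
    r_max = ratios[np.isfinite(ratios)].max()
    tau = np.linspace(1.0, 1.05 * r_max, n_tau)
    # inf ratios compare False against every tau, so failures never count.
    rho = np.array([(ratios[:, s][:, None] <= tau).mean(axis=0)
                    for s in range(times.shape[1])])
    return tau, rho

# Example using the first two rows of Table 1 (Ex.F.Roth, n = 10^4, 10^5):
T = np.array([[4.888547, 0.550768, 2.957423, 0.521059, 1.823025],
              [35.695368, 6.393457, 25.041071, 5.640529, 18.592210]])
tau, rho = performance_profile(T)
```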
Table 1.  Numerical results for five algorithms
Fun. n FR PRP+ DY HZ Algor.2.1
(Entries are CPU times in seconds for each of the five methods. The n values 104, 105, 106 and 3*104 etc. denote 10^4, 10^5, 10^6 and 3*10^4; "F" marks a run that failed.)
Ex.F.Roth 104 4.888547 0.550768 2.957423 0.521059 1.823025
105 35.695368 6.393457 25.041071 5.640529 18.592210
106 287.654246 48.039330 228.797928 46.116688 220.881878
Ex.Trig. 104 6.477725 0.879721 4.214142 1.521069 2.249725
105 F 12.156054 116.274122 74.643352 13.436983
106 F 137.648005 209.220891 159.74406 140.003615
Ex.Rosen. 104 0.937497 0.334996 1.779752 0.269760 1.198996
105 3.666262 2.938029 19.050805 2.371920 12.510256
106 76.992755 28.496619 154.785583 19.094775 100.920667
Ex.W.Holst 104 0.927501 0.902176 0.953671 0.641670 0.667595
105 11.200851 6.359390 5.016512 5.216413 5.085647
106 108.520029 63.355358 53.428723 55.247560 52.571641
Ex.Beale 104 0.324522 0.275613 0.418340 0.157054 0.228633
105 2.474278 1.870461 2.407946 1.499047 2.405817
106 23.457059 19.579839 26.324518 15.640053 24.912454
Ex.Penalty 104 0.275020 0.201060 0.208781 0.176900 0.203811
105 1.234828 2.239090 1.827919 1.724014 1.821343
106 9.546747 17.71788 27.437794 14.134732 16.033043
Per.Quad. 104 3.02902 3.077180 3.040349 3.358025 46.029533
105 209.611665 103.424010 114.210432 108.280599 F
Raydan 1 104 5.081648 3.277429 4.728205 3.713061 11.254018
105 97.715159 46.250008 127.696125 65.739349 141.324543
Raydan 2 104 0.109849 0.087747 0.228718 0.057772 0.057323
105 0.833817 0.494746 1.978097 0.504858 0.533447
106 8.214171 4.698048 16.906394 4.956234 5.812667
Diag.1 104 5.789406 3.827681 6.911255 3.708083 3.538550
105 F 76.219241 F 70.737746 70.438309
Diag.2 104 5.678352 F 5.285281 F 5.166993
105 154.534720 F 152.123264 F 145.797863
Diag.3 104 6.181573 4.845072 6.228637 6.442592 5.311859
105 57.195661 52.810276 55.066374 48.976185 52.161715
Hager 104 1.380050 0.721078 1.358417 0.732179 1.320932
105 22.745343 11.942894 25.253252 11.895805 22.043234
106 F 122.218850 F 156.087713 187.476220
Gen.Trid.1 104 2.445174 2.512342 2.528296 2.258533 1.755517
105 25.293375 21.644389 25.202319 24.239381 20.201541
106 234.095747 254.975842 204.126831 197.991582 145.845123
Ex.Trid.1 104 6.224192 17.140057 1.533758 1.532657 13.182526
105 143.494302 244.260502 16.205910 14.276184 133.624706
Ex.Three.Exp. 104 0.199302 0.225054 0.198419 0.113357 0.142117
105 2.162488 2.405353 1.905207 1.265936 1.468474
106 18.841829 21.440576 15.564795 10.447416 14.452928
Gen.Trid.2 104 13.541982 3.192677 4.418622 2.970148 2.529354
105 F 30.674646 F 24.129818 23.986342
Diag.4 104 0.094771 0.057988 0.114438 0.062960 0.093890
105 1.202458 0.388207 1.013161 0.202075 0.890367
106 19.755904 3.749717 8.999546 1.906101 8.861827
Diag.5 104 0.217310 0.089796 0.238359 0.141571 0.083229
105 1.373886 0.744259 2.028916 1.963680 0.693929
106 13.342298 8.012713 18.812321 18.810192 6.833566
Ex.Himm. 104 0.611196 0.536196 0.141640 0.087034 0.132999
105 5.446782 5.385943 1.156565 0.724150 1.104406
106 53.854410 53.065089 10.564224 7.176464 11.838695
Gen.PSC1 104 2.934187 1.250311 11.817101 2.056868 28.300200
105 7.925998 15.507716 57.672689 14.110938 98.215964
Ex.PSC1 104 0.114309 0.116711 0.137112 0.121277 0.120501
105 1.172552 0.942523 1.131255 1.341338 1.095562
106 13.738276 10.067906 9.391618 12.643046 10.754309
Ex.B.Diag.BD1 104 0.481994 0.093661 7.119658 0.153198 0.184555
105 4.780717 0.828624 63.779966 1.219591 1.886973
106 45.244221 7.678318 F 13.094154 17.652279
Ex.Maratos 104 7.298980 0.215135 7.417174 0.400084 1.720770
105 72.871313 2.399795 69.118730 2.917748 15.965102
106 F 24.968870 F 37.905524 188.771631
Qua.Diag.Pert. 104 20.021639 20.021639 23.190386 6.862350 20.0167895
Ex.Wood 104 3.626209 0.449196 2.218795 0.481428 1.548869
105 30.297717 5.710406 25.452955 5.599862 24.178781
106 F 83.358482 F 50.926588 80.176211
Ex.Hiebert 104 12.325250 0.423625 4.209053 0.505931 4.111580
105 61.420777 7.301559 44.310328 4.483430 43.492047
106 F 98.625677 F 40.245693 F
Ex.Qua.Pena.QP1 104 0.209738 0.131130 0.120703 0.213959 0.120383
105 0.931057 2.044439 0.928968 1.702844 1.680728
106 19.710059 13.914293 5.142260 6.478819 13.674789
Ex.Qua.Pena.QP2 104 F 0.368244 7.436318 0.329816 1.646625
105 F 3.211811 214.833356 3.229403 14.055916
106 F 37.820369 F 31.182550 183.466287
Qua.QF2 104 6.548727 6.315666 10.817725 6.994623 6.088805
FLETCBV3 1000 3.391656 3.208908 3.322490 9.722141 10.174606
FLETCHCR 5000 17.258755 14.429972 16.787908 14.802347 10.449042
BDQRTIC 104 83.808804 F 11.207452 9.377903 75.611094
TRIDIA 104 20.207208 7.295958 13.794344 31.899388 85.858224
ARGLINB 104 0.110820 0.107058 0.077671 0.068173 0.104807
105 0.722081 5.972367 0.946074 0.622368 0.790848
106 7.334012 8.595424 6.361517 13.706507 5.689481
ARWHEAD 104 0.370001 0.077944 0.132404 0.090989 0.101133
105 1.579815 0.563831 1.544622 1.490355 1.243687
106 5.837702 5.826076 4.586703 6.565910 5.671701
NONBIA 104 0.537552 0.102890 1.058223 0.103249 0.8355213
105 2.747465 1.337234 2.297892 2.992103 3.579316
106 5.671371 22.742947 3.420499 6.936500 29.025324
NONDQUAR 104 0.442060 0.190346 0.158307 0.400302 0.649100
105 0.796334 2.503826 0.704471 1.828935 7.990690
106 5.778999 20.906151 4.532799 17.272681 67.882148
EG2 104 0.994080 0.313402 1.554111 1.337630 2.278667
105 7.160837 35.861218 3.615775 7.705992 10.859091
106 43.235295 14.166499 11.586350 10.875278 50.416843
DIXMAANA 3*104 1.271498 0.762316 1.282368 0.736608 0.756586
3*105 11.612663 10.441716 11.562110 8.106027 7.118575
3*106 114.760219 102.835981 125.107882 79.990697 75.923830
DIXMAANB 3*104 0.517920 0.385757 0.542675 0.537647 0.354278
3*105 4.638275 7.027447 9.318548 7.237380 4.477140
3*106 82.225445 70.181132 91.019613 71.300978 43.957930
DIXMAANC 3*104 1.841306 0.543875 1.021507 0.939805 0.525989
3*105 16.686708 5.288750 9.791455 8.450863 5.193175
3*106 218.619095 51.116314 98.113856 83.487215 50.139642
DIXMAAND 3*104 1.368970 0.791623 1.046740 0.935925 0.779501
3*105 12.372986 7.759932 10.334092 9.122740 7.526088
3*106 120.560906 114.310394 101.432738 89.735515 83.011910
DIXMAANE 3*104 32.531045 F 34.405608 47.489975 296.898199
DIXMAANF 3*104 35.843660 F 29.699410 62.706209 283.669710
DIXMAANG 3*104 26.189755 F 36.058009 36.637936 293.464782
DIXMAANH 3*104 47.59872 F 25.163339 33.846983 227.372189
DIXMAANI 3*104 108.335661 F 139.441850 199.234093 271.533107
DIXMAANJ 3*104 33.383252 F 51.195703 46.553180 207.800703
DIXMAANK 3*104 29.999585 F 48.466471 48.972313 193.059611
DIXMAANL 3*104 29.847873 F 46.188729 24.687342 240.765512
LIARWHD 104 0.293234 0.167451 0.602854 0.194428 1.350052
105 23.934621 5.071408 4.936705 1.784358 20.476402
106 F 17.774764 84.730927 20.451442 286.401251
ENGVAL1 104 2.945172 1.918970 2.481500 1.763492 2.291727
105 25.852729 21.748152 16.928397 14.810821 14.884762
106 165.309436 138.481046 194.509697 198.810671 190.808154
EDENSCH 104 0.847047 0.669810 0.844103 0.561064 0.506037
105 7.170456 6.061741 7.142879 5.470726 5.411347
106 60.312014 53.737460 70.688592 46.010424 51.866055
VARDIM 104 0.208694 0.744199 0.206950 0.186933 0.177388
105 1.870152 1.437232 1.864668 2.162191 2.020566
106 28.081020 23.333807 27.601484 22.723724 22.667290
QUARTC 104 105.093070 50.764779 4.301317 54.501230 0.196144
105 F F 63.741145 F 3.083412
106 F F F F 50.399742
SINQUAD 5000 31.929046 49.651847 131.459433 29.357854 119.515009
DENSCHNB 104 0.053052 0.039400 0.044617 0.042022 0.039368
105 0.331962 0.373278 0.470038 0.411125 0.381222
106 3.100415 3.535380 5.128739 4.312925 3.585630
DENSCHNF 104 0.099564 0.084990 0.105827 0.120146 0.111866
105 0.904616 0.758068 0.936884 1.111403 1.071840
106 9.215575 8.414838 9.944450 11.481290 10.755364
COSINE 104 0.172759 0.482814 0.222049 0.133286 0.170851
105 1.422683 4.501162 1.686364 1.103350 1.410227
106 13.332279 39.732100 17.426555 10.334436 13.239945