
A modified scaled memoryless BFGS preconditioned conjugate gradient algorithm for nonsmooth convex optimization
Department of Applied Mathematics, Hainan University, Haikou 570228, China
This paper presents a nonmonotone scaled memoryless BFGS preconditioned conjugate gradient algorithm for solving nonsmooth convex optimization problems. The algorithm combines the scaled memoryless BFGS preconditioned conjugate gradient method with a nonmonotone line search technique and the Moreau-Yosida regularization, and it works with approximate function and gradient values of the Moreau-Yosida regularization instead of the exact ones. Under mild conditions, global convergence of the proposed method is established. Preliminary numerical results and related comparisons show that the proposed method can be applied to large-scale nonsmooth convex optimization problems.
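For readers unfamiliar with the construction, the Moreau-Yosida regularization mentioned above is the standard smoothing device of [16,17]; the formulas below are background, not a restatement of the paper's exact formulation. For a convex (possibly nonsmooth) $f:\mathbb{R}^n\to\mathbb{R}$ and a parameter $\lambda>0$,
$$F(x)=\min_{z\in\mathbb{R}^n}\Big\{f(z)+\tfrac{1}{2\lambda}\|z-x\|^2\Big\},$$
whose unique minimizer $p(x)$ is the proximal point of $x$. The function $F$ has the same minimizers as $f$, is differentiable with Lipschitz continuous gradient, and
$$\nabla F(x)=\frac{x-p(x)}{\lambda}.$$
Since $p(x)$ is rarely available exactly, methods of this type work with approximations $F_a(x)\approx F(x)$ and $g_a(x)\approx\nabla F(x)$; the precise accuracy requirements are those stated in the paper and are not reproduced here. The nonmonotone technique referred to in the abstract is, in the spirit of Zhang and Hager [35], an acceptance test of the form
$$F_a(x_k+\alpha_k d_k)\le C_k+\delta\,\alpha_k\, g_a(x_k)^T d_k,$$
where the reference value $C_k$ is a weighted average of past (approximate) function values rather than the most recent one.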
References:
[1] N. Andrei, Scaled conjugate gradient algorithms for unconstrained optimization, Computational Optimization and Applications, 38 (2007), 401-416. doi: 10.1007/s10589-007-9055-7.
[2] A. Auslender, Numerical methods for nondifferentiable convex optimization, Mathematical Programming Study, 30 (1987), 102-126.
[3] S. Babaie-Kafaki, A modified scaled memoryless BFGS preconditioned conjugate gradient method for unconstrained optimization, 4OR: A Quarterly Journal of Operations Research, 11 (2013), 361-374. doi: 10.1007/s10288-013-0233-4.
[4] S. Babaie-Kafaki and R. Ghanbari, A class of descent four-term extension of the Dai-Liao conjugate gradient method based on the scaled memoryless BFGS update, Journal of Industrial and Management Optimization, 13 (2017), 649-658. doi: 10.3934/jimo.2016038.
[5] J. Barzilai and J. M. Borwein, Two-point step size gradient methods, IMA Journal of Numerical Analysis, 8 (1988), 141-148.
[6] E. Birgin and J. M. Martínez, A spectral conjugate gradient method for unconstrained optimization, Applied Mathematics and Optimization, 43 (2001), 117-128. doi: 10.1007/s00245-001-0003-0.
[7] J. F. Bonnans, J. C. Gilbert, C. Lemaréchal and C. Sagastizábal, A family of variable-metric proximal methods, Mathematical Programming, 68 (1995), 15-47. doi: 10.1007/BF01585756.
[8] J. V. Burke and M. Qian, On the superlinear convergence of the variable metric proximal point algorithm using Broyden and BFGS matrix secant updating, Mathematical Programming, 88 (2000), 157-181. doi: 10.1007/PL00011373.
[9] X. Chen and M. Fukushima, Proximal quasi-Newton methods for nondifferentiable convex optimization, Mathematical Programming, 85 (1999), 313-334. doi: 10.1007/s101070050059.
[10] E. D. Dolan and J. J. Moré, Benchmarking optimization software with performance profiles, Mathematical Programming, Ser. A, 91 (2002), 201-213. doi: 10.1007/s101070100263.
[11] M. Fukushima, A descent algorithm for nonsmooth convex optimization, Mathematical Programming, 30 (1984), 163-175. doi: 10.1007/BF02591883.
[12] M. Fukushima and L. Q. Qi, A globally and superlinearly convergent algorithm for nonsmooth convex minimization, SIAM Journal on Optimization, 6 (1996), 1106-1120. doi: 10.1137/S1052623494278839.
[13] M. Haarala, K. Miettinen and M. M. Mäkelä, New limited memory bundle method for large-scale nonsmooth optimization, Optimization Methods and Software, 19 (2004), 673-692. doi: 10.1080/10556780410001689225.
[14] M. Haarala, K. Miettinen and M. M. Mäkelä, Globally convergent limited memory bundle method for large-scale nonsmooth optimization, Mathematical Programming, 109 (2007), 181-205. doi: 10.1007/s10107-006-0728-2.
[15] W. W. Hager and H. C. Zhang, A survey of nonlinear conjugate gradient methods, Pacific Journal of Optimization, 2 (2006), 35-58.
[16] J. B. Hiriart-Urruty and C. Lemaréchal, Convex Analysis and Minimization Algorithms, Springer, Berlin, 1993. doi: 10.1007/978-3-662-02796-7.
[17] C. Lemaréchal and C. Sagastizábal, Practical aspects of the Moreau-Yosida regularization, I: Theoretical preliminaries, SIAM Journal on Optimization, 7 (1997), 367-385.
[18] Q. Li, Conjugate gradient type methods for the nondifferentiable convex minimization, Optimization Letters, 7 (2013), 533-545.
[19] D. H. Li and M. Fukushima, A derivative-free line search and global convergence of Broyden-like method for nonlinear equations, Optimization Methods and Software, 13 (2000), 181-201.
[20] S. Lu, Z. X. Wei and L. Li, A trust region algorithm with adaptive cubic regularization methods for nonsmooth convex minimization, Computational Optimization and Applications, 51 (2012), 551-573.
[21] L. Lukšan and J. Vlček, Test Problems for Nonsmooth Unconstrained and Linearly Constrained Optimization, Technical Report No. 798, Institute of Computer Science, Academy of Sciences of the Czech Republic, 2000.
[22] R. Mifflin, A quasi-second-order proximal bundle algorithm, Mathematical Programming, 73 (1996), 51-72.
[23] Y. G. Ou and H. C. Lin, An ODE-like nonmonotone method for nonsmooth convex optimization, Journal of Applied Mathematics and Computing, 52 (2016), 265-285.
[24] L. Q. Qi, Convergence analysis of some algorithms for solving nonsmooth equations, Mathematics of Operations Research, 18 (1993), 227-244. doi: 10.1287/moor.18.1.227.
[25] A. I. Rauf and M. Fukushima, A globally convergent BFGS method for nonsmooth convex optimization, Journal of Optimization Theory and Applications, 104 (2000), 539-558.
[26] N. Sagara and M. Fukushima, A trust region method for nonsmooth convex optimization, Journal of Industrial and Management Optimization, 1 (2005), 171-180.
[27] D. F. Shanno, On the convergence of a new conjugate gradient algorithm, SIAM Journal on Numerical Analysis, 15 (1978), 1247-1257.
[28] J. Shen, L. P. Pang and D. Li, An approximate quasi-Newton bundle-type method for nonsmooth optimization, Abstract and Applied Analysis, 2013, Art. ID 697474, 7 pp. doi: 10.1155/2013/697474.
[29] W. Y. Sun and Y. X. Yuan, Optimization Theory and Methods: Nonlinear Programming, Springer, New York, 2006.
[30] G. L. Yuan, Z. H. Meng and Y. Li, A modified Hestenes and Stiefel conjugate gradient algorithm for large scale nonsmooth minimizations and nonlinear equations, Journal of Optimization Theory and Applications, 168 (2016), 129-152.
[31] G. L. Yuan, Z. Sheng and W. J. Liu, The modified HZ conjugate gradient algorithm for large scale nonsmooth optimization, PLoS ONE, 11 (2016), e0164289, 15 pp. doi: 10.1371/journal.pone.0164289.
[32] G. L. Yuan and Z. X. Wei, The Barzilai and Borwein gradient method with nonmonotone line search for nonsmooth convex optimization problems, Mathematical Modelling and Analysis, 17 (2012), 203-216.
[33] G. L. Yuan, Z. X. Wei and G. Y. Li, A modified Polak-Ribière-Polyak conjugate gradient algorithm for nonsmooth convex programs, Journal of Computational and Applied Mathematics, 255 (2014), 86-96.
[34] G. L. Yuan, Z. X. Wei and Z. X. Wang, Gradient trust region algorithm with limited memory BFGS update for nonsmooth convex minimization, Computational Optimization and Applications, 54 (2013), 45-64.
[35] H. C. Zhang and W. W. Hager, A nonmonotone line search technique and its application to unconstrained optimization, SIAM Journal on Optimization, 14 (2004), 1043-1056. doi: 10.1137/S1052623403428208.
[36] L. Zhang, W. J. Zhou and D. H. Li, A descent modified Polak-Ribière-Polyak conjugate gradient method and its global convergence, IMA Journal of Numerical Analysis, 26 (2006), 629-640. doi: 10.1093/imanum/drl016.


Small-scale test problems (dimension, initial point, optimal value):
No. | Function | n | Initial point | Optimal value
1 | Rosenbrock | 2 | (-1.2; 1) | 0
2 | Crescent | 2 | (-1.5; 2) | 0
3 | CB2 | 2 | (1; -0.1) | 1.9522245
4 | CB3 | 2 | (2; 2) | 2.0
5 | DEM | 2 | (1; 1) | -3
6 | QL | 2 | (-1; 5) | 7.2
7 | LQ | 2 | (-0.5; -0.5) | -1.4142136
8 | Mifflin 1 | 2 | (0.8; 0.6) | -1.0
9 | Mifflin 2 | 2 | (-1; -1) | -1.0
10 | Wolfe | 2 | (3; 2) | -8
11 | Rosen-Suzuki | 4 | (0; 0; 0; 0) | -44
12 | Shor | 5 | (0; 0; 0; 0; 1) | 22.600162
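As a concrete illustration of the approximate Moreau-Yosida values referred to in the abstract, the sketch below evaluates them numerically for the LQ problem in the table above, using the usual definition $f(x)=\max\{-x_1-x_2,\ -x_1-x_2+x_1^2+x_2^2-1\}$ and SciPy's Nelder-Mead routine as a stand-in inner solver. The function name `moreau_yosida`, the parameter `lam`, and the choice of inner solver are illustrative assumptions, not the approximation scheme used in the paper.

```python
# Illustrative only: approximate Moreau-Yosida value/gradient for the LQ test problem.
import numpy as np
from scipy.optimize import minimize

def f_lq(x):
    """LQ test function: f(x) = max(-x1 - x2, -x1 - x2 + x1^2 + x2^2 - 1)."""
    return max(-x[0] - x[1], -x[0] - x[1] + x[0]**2 + x[1]**2 - 1.0)

def moreau_yosida(f, x, lam=1.0):
    """Approximate F(x) = min_z f(z) + ||z - x||^2 / (2*lam) and grad F(x) = (x - p(x)) / lam."""
    obj = lambda z: f(z) + np.dot(z - x, z - x) / (2.0 * lam)
    res = minimize(obj, x, method="Nelder-Mead",
                   options={"xatol": 1e-10, "fatol": 1e-10})
    p = res.x                       # approximate proximal point p(x)
    return res.fun, (x - p) / lam   # approximate F(x) and its gradient

x0 = np.array([-0.5, -0.5])         # initial point from the table
F0, g0 = moreau_yosida(f_lq, x0)
print(F0, g0)                       # smooth surrogate value and gradient at x0
```

Running this at the tabulated initial point (-0.5; -0.5) returns a smooth surrogate value and gradient that a conjugate gradient iteration can then work with.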
Results on the small-scale problems; each entry has the form NI/NF/f (iterations / function evaluations / final objective value):
No. | Algorithm 3.1 | LWTR | YWBB | SFTR | RFBFGS
1 | 3/4/2.6178e-9 | 6/13/0.2976e-7 | 54/56/3.4484e-7 | 48/49/7.1545e-4 | 4/5/6.2072e-10
2 | 3/4/3.3026e-6 | 3/7/6.5430e-4 | 14/16/2.7450e-5 | 31/32/1.6000e-3 | 35/36/3.0590e-7
3 | 4/5/1.9522 | 5/12/1.9522 | 13/15/1.9522 | 54/55/1.9573 | 5/6/1.9522
4 | 2/3/2.0000 | 6/13/2.0000 | 4/8/2.0000 | 55/56/2.0100 | 2/3/2.0076
5 | 3/4/-3.0000 | 8/16/-3.0000 | 4/7/-3.0000 | 5/6/-3.0000 | 3/4/-3.0000
6 | 11/12/7.2000 | 4/9/7.2000 | 22/25/7.2000 | 48/49/7.2003 | 10/11/7.2000
7 | 3/4/-1.4142 | 5/10/-1.4142 | 6/7/-1.4142 | 3/4/-1.4118 | 2/3/-1.4033
8 | 34/35/-1.0000 | 15/31/-1.0000 | 3/6/-0.9938 | 59/60/-1.0000 | 57/58/-1.0000
9 | 2/3/-1.0000 | 8/16/-1.0000 | 12/13/-0.9999 | 4/5/-0.9997 | 2/3/-0.9813
10 | 3/4/-8.0000 | 12/24/-8.0000 | 9/12/-8.0000 | 43/46/-8.0000 | 4/5/-8.0000
11 | 49/106/-43.9999 | 20/40/-44.0000 | 8/9/-43.9493 | 60/61/-39.9924 | 25/31/-43.9982
12 | 14/16/22.6019 | 14/28/22.6002 | 9/10/22.6004 | 71/72/22.6892 | 66/152/22.6017
Large-scale test problems:
No. | Function | Initial point
1 | Generalization of MAXQ |
2 | Generalization of MXHILB |
3 | Chained LQ |
4 | Number of active faces |
5 | Nonsmooth generalization of Brown 2 |
6 | Chained Mifflin 2 |
7 | Chained Crescent I |
8 | Chained Crescent II |
Results on the large-scale problems; each entry has the form NI/NF/f as above:
No. | n | Algorithm 3.1 | CG-YWL
1 | 1000 | 186/1601/2.6568e-10 | 225/4710/6.9354e-8
  | 5000 | 242/2725/1.2183e-10 | 250/5235/6.8798e-8
  | 10000 | 253/2997/3.4045e-10 | 261/5466/6.6528e-8
2 | 1000 | 94/1301/4.0582e-9 | 91/1482/8.2738e-9
  | 5000 | 119/1499/2.6731e-9 | 111/1938/9.7206e-9
  | 10000 | 129/1901/2.1905e-9 | 120/2127/5.8524e-9
3 | 1000 | 37/110/2.3278e-9 | 37/114/7.2687e-9
  | 5000 | 39/116/5.9987e-9 | 39/120/9.0932e-9
  | 10000 | 40/121/1.6943e-9 | 40/123/9.0941e-9
4 | 1000 | 71/891/3.6735e-11 | 77/1026/6.8037e-9
  | 5000 | 82/937/7.9864e-11 | 90/1281/7.8405e-9
  | 10000 | 86/1208/5.7163e-11 | 96/1401/9.9366e-9
5 | 1000 | 35/114/1.5021e-11 | 38/117/7.2687e-9
  | 5000 | 39/120/5.2352e-11 | 40/123/9.0932e-9
  | 10000 | 41/126/2.1136e-11 | 41/125/1.8188e-8
6 | 1000 | 38/116/-5.3979e+3 | 37/114/-2.4975e+4
  | 5000 | 41/123/-3.2153e+4 | 39/120/-1.2498e+5
  | 10000 | 43/129/-2.0127e+4 | 40/123/-2.4998e+5
7 | 1000 | 34/101/7.0463e-11 | 37/114/5.4897e-9
  | 5000 | 36/113/3.8145e-11 | 39/120/6.8294e-9
  | 10000 | 39/120/4.3654e-11 | 40/123/6.8253e-9
8 | 1000 | 37/112/6.0424e-11 | 39/120/6.8185e-9
  | 5000 | 39/121/1.8205e-11 | 41/126/8.5258e-9
  | 10000 | 42/125/2.6473e-11 | 42/129/8.5262e-9
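Part of what lets memoryless updates scale to the n = 1000-10000 problems above is that the preconditioned direction can be formed from a handful of inner products, without ever storing an n-by-n matrix. The sketch below shows a generic scaled memoryless BFGS direction of the Shanno/Andrei type (cf. [1,3,27]); it is a sketch of the general technique with one common scaling choice, not the exact search direction of Algorithm 3.1.

```python
# Generic scaled memoryless BFGS preconditioned direction (cf. [1,3,27]); illustrative only.
import numpy as np

def scaled_memoryless_bfgs_direction(g, s, y):
    """Return d = -H g, where H is the BFGS update of theta*I with the latest pair (s, y).

    Only inner products and vector additions are used, so the cost is O(n) and
    no n-by-n matrix is formed -- which is what keeps such methods practical at
    the dimensions shown in the table above.
    """
    sy = float(np.dot(s, y))
    if sy <= 1e-12 * np.linalg.norm(s) * np.linalg.norm(y):
        return -g                       # curvature too small: fall back to steepest descent
    theta = sy / float(np.dot(y, y))    # one common scaling of the initial matrix
    rho = 1.0 / sy
    sg = float(np.dot(s, g))
    yg = float(np.dot(y, g))
    # d = -[theta*I - theta*rho*(s y' + y s') + (rho + theta*rho^2 * y'y) s s'] g
    return (-theta * g
            + theta * rho * (yg * s + sg * y)
            - (rho + theta * rho**2 * float(np.dot(y, y))) * sg * s)

# Tiny smoke test on random data (not one of the benchmark problems):
rng = np.random.default_rng(0)
g, s, y = rng.standard_normal((3, 5))
print(scaled_memoryless_bfgs_direction(g, s, y))
```

In the nonsmooth setting of this paper, g, s and y would be built from the approximate Moreau-Yosida gradients discussed after the abstract.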