April 2017, 13(2): 595-608. doi: 10.3934/jimo.2016034

A scaled conjugate gradient method with moving asymptotes for unconstrained optimization problems

1. Nanjing University of Aeronautics and Astronautics, Nanjing, Jiangsu Province, 211106, China
2. Huaibei Normal University, Huaibei, Anhui Province, 235000, China
3. Hubei Engineering University, Xiaogan, Hubei Province, 432000, China

Received: January 2015. Revised: December 2015. Published: May 2016.

Fund Project: The authors are supported by the Natural Science Foundation of China (11471159, 11571169), the Natural Science Foundation of Jiangsu Province (BK20141409), the Education Department Foundation of Anhui Province (KJ2016A651, 2014jyxm161), and the Science and Technology Foundation of the Department of Education of Hubei Province (D20152701).

In this paper, a scaled method that combines the conjugate gradient approach with moving asymptotes is presented for solving large-scale nonlinear unconstrained optimization problems. A diagonal matrix is obtained by the moving-asymptote technique, and a scaled gradient is determined by multiplying the gradient by this diagonal matrix. The search direction is either a scaled conjugate gradient direction or the negative scaled gradient direction, depending on which conditions hold. The direction satisfies the sufficient descent property whenever the step size satisfies the strong Wolfe conditions. A global convergence analysis of the method is also provided. Numerical results show that the scaled method is efficient for solving some large-scale nonlinear problems.
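The iteration outlined in the abstract can be sketched as follows. This is a minimal illustrative Python sketch, not the paper's algorithm: the diagonal scaling `D` here uses a simple componentwise |s|/|y| ratio as a hypothetical stand-in for the moving-asymptote matrix, the β formula is an FR-type scaled choice, and the line search is a generic textbook strong Wolfe bisection. Only the overall structure — scaled CG direction with a fall-back to the negative scaled gradient when the descent test fails — mirrors the abstract.

```python
import numpy as np

def strong_wolfe_step(f, grad, x, d, c1=1e-4, c2=0.4, max_iter=30):
    """Bisection search for a step satisfying the strong Wolfe conditions
    (a generic textbook scheme, not the paper's line search)."""
    lo, hi, alpha = 0.0, np.inf, 1.0
    f0, g0d = f(x), grad(x) @ d
    for _ in range(max_iter):
        fx = f(x + alpha * d)
        gd = grad(x + alpha * d) @ d
        if fx > f0 + c1 * alpha * g0d:   # Armijo violated: step too long
            hi = alpha
        elif gd < c2 * g0d:              # slope still too negative: too short
            lo = alpha
        elif gd > -c2 * g0d:             # slope too positive: too long
            hi = alpha
        else:
            return alpha
        alpha = 0.5 * (lo + hi) if np.isfinite(hi) else 2.0 * alpha
    return alpha

def scaled_cg(f, grad, x0, tol=1e-6, max_iter=500):
    """Scaled CG sketch.  D is a hypothetical diagonal scaling built from
    componentwise |s|/|y| ratios, standing in for the paper's
    moving-asymptote matrix; the search direction is a scaled conjugate
    gradient direction, falling back to the negative scaled gradient."""
    x = x0.astype(float)
    g = grad(x)
    D = np.ones_like(x)              # initial diagonal scaling
    d = -D * g                       # start with the negative scaled gradient
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = strong_wolfe_step(f, grad, x, d)
        x_new = x + alpha * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        # hypothetical diagonal update (safeguarded ratio of changes)
        D = np.clip(np.abs(s) / (np.abs(y) + 1e-12), 1e-4, 1e4)
        beta = (g_new @ (D * g_new)) / (g @ (D * g))  # FR-type scaled beta
        d_new = -D * g_new + beta * d
        if g_new @ d_new >= 0.0:     # not a descent direction: fall back
            d_new = -D * g_new
        x, g, d = x_new, g_new, d_new
    return x
```

On an ill-conditioned diagonal quadratic, the |s|/|y| ratios recover the inverse Hessian diagonal after one step, so the scaled direction is close to the Newton direction — which is the motivation for scaling the gradient before forming the CG direction.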

Citation: Guanghui Zhou, Qin Ni, Meilan Zeng. A scaled conjugate gradient method with moving asymptotes for unconstrained optimization problems. Journal of Industrial & Management Optimization, 2017, 13 (2) : 595-608. doi: 10.3934/jimo.2016034
References:
[1] M. Al-Baali, Y. Narushima and H. Yabe, A family of three-term conjugate gradient methods with sufficient descent property for unconstrained optimization, Computational Optimization and Applications, 60 (2015), 89-110. doi: 10.1007/s10589-014-9662-z.
[2] N. Andrei, An unconstrained optimization test functions collection, Advanced Modeling and Optimization, 10 (2008), 147-161.
[3] Y. Dai and Y. Yuan, A nonlinear conjugate gradient method with a strong global convergence property, SIAM Journal on Optimization, 10 (1999), 177-182. doi: 10.1137/S1052623497318992.
[4] Y. Dai and C. Kou, A nonlinear conjugate gradient algorithm with an optimal property and an improved Wolfe line search, SIAM Journal on Optimization, 23 (2013), 296-320. doi: 10.1137/100813026.
[5] E. D. Dolan and J. J. Moré, Benchmarking optimization software with performance profiles, Mathematical Programming, 91 (2002), 201-213. doi: 10.1007/s101070100263.
[6] R. Fletcher and C. M. Reeves, Function minimization by conjugate gradients, The Computer Journal, 7 (1964), 149-154. doi: 10.1093/comjnl/7.2.149.
[7] N. I. M. Gould, D. Orban and P. L. Toint, CUTEr and SifDec: A constrained and unconstrained testing environment, revisited, ACM Transactions on Mathematical Software, 29 (2003), 353-372. doi: 10.1145/962437.962438.
[8] W. Hager and H. Zhang, A new conjugate gradient method with guaranteed descent and an efficient line search, SIAM Journal on Optimization, 16 (2005), 170-192. doi: 10.1137/030601880.
[9] W. Hager and H. Zhang, The limited memory conjugate gradient method, SIAM Journal on Optimization, 23 (2013), 2150-2168. doi: 10.1137/120898097.
[10] M. R. Hestenes and E. Stiefel, Methods of conjugate gradients for solving linear systems, Journal of Research of the National Bureau of Standards, 49 (1952), 409-436. doi: 10.6028/jres.049.044.
[11] D. Luenberger and Y. Ye, Linear and Nonlinear Programming, 3rd edition, Springer-Verlag, New York, 2008.
[12] W. Nakamura, Y. Narushima and H. Yabe, Nonlinear conjugate gradient methods with sufficient descent properties for unconstrained optimization, Journal of Industrial and Management Optimization, 9 (2013), 595-619. doi: 10.3934/jimo.2013.9.595.
[13] Y. Narushima, H. Yabe and J. A. Ford, A three-term conjugate gradient method with sufficient descent property for unconstrained optimization, SIAM Journal on Optimization, 21 (2011), 212-230. doi: 10.1137/080743573.
[14] Q. Ni, A globally convergent method of moving asymptotes with trust region technique, Optimization Methods and Software, 18 (2003), 283-297. doi: 10.1080/1055678031000118491.
[15] E. Polak and G. Ribiere, Note sur la convergence de méthodes de directions conjuguées, Revue française d'informatique et de recherche opérationnelle, série rouge, 3 (1969), 35-43.
[16] B. T. Polyak, The conjugate gradient method in extremal problems, USSR Computational Mathematics and Mathematical Physics, 9 (1969), 94-112. doi: 10.1016/0041-5553(69)90035-4.
[17] K. Svanberg, The method of moving asymptotes—a new method for structural optimization, International Journal for Numerical Methods in Engineering, 24 (1987), 359-373. doi: 10.1002/nme.1620240207.
[18] H. Wang and Q. Ni, A new method of moving asymptotes for large-scale unconstrained optimization, Applied Mathematics and Computation, 203 (2008), 62-71. doi: 10.1016/j.amc.2008.03.035.
[19] W. Zhou and Y. Zhou, On the strong convergence of a modified Hestenes-Stiefel method for nonconvex optimization, Journal of Industrial and Management Optimization, 9 (2013), 893-899. doi: 10.3934/jimo.2013.9.893.

Figure 1.  CPU time performance profiles on a log2 scale (n = 10^2)
Figure 2.  Iterations performance profiles on a log2 scale (n = 10^2)
Figure 3.  CPU time performance profiles on a log2 scale (n = 10^3)
Figure 4.  Iterations performance profiles on a log2 scale (n = 10^3)
Figure 5.  CPU time performance profiles on a log2 scale (n = 10^4)
Figure 6.  Iterations performance profiles on a log2 scale (n = 10^4)
Table 1.  Test functions
No. Function Name No. Function Name
1 ARWHEAD 17 Ext quadratic penalty QP2
2 COSINE 18 Ext quadratic exponential EP1
3 EDENSCH 19 Ext Tridiagonal 2
4 EG2 20 Ext DENSCHNF
5 ENGVAL1 21 HIMMELBG
6 GENROSE 22 HIMMELH
7 Ext Freudenstein & Roth 23 TOINTGSS
8 Raydan 2 24 Extended DENSCHNB
9 Ext Tridiagonal 1 25 LIARWHD
10 Ext TET 26 Extended Trigonometric
11 Diagonal 5 27 Extended Penalty
12 Diagonal 2 28 Extended BD1
13 Ext Maratos 29 Perturbed Quadratic
14 Ext Cliff 30 Raydan 1
15 Perturbed quadratic diagonal 31 Diagonal 4
16 Ext quadratic penalty QP1 32 QUARTC