doi: 10.3934/jimo.2021143
Online First


Nonmonotone spectral gradient method based on memoryless symmetric rank-one update for large-scale unconstrained optimization

Hong Seng Sim, Chuei Yee Chen, Wah June Leong and Jiao Li

1. Centre for Mathematical Sciences, Universiti Tunku Abdul Rahman, 43000 Kajang, Malaysia
2. Department of Mathematics and Statistics, Faculty of Science, Universiti Putra Malaysia, 43400 UPM Serdang, Malaysia
3. Institute for Mathematical Research, Universiti Putra Malaysia, 43400 UPM Serdang, Malaysia
4. School of Mathematics and Statistics, Changsha University of Science and Technology, Changsha, Hunan 410114, China

* Corresponding author: cychen@upm.edu.my

Received: March 2021. Revised: May 2021. Early access: August 2021.

Fund Project: The first author was supported by the UTAR Research Fund IPSR/RMC/UTARRF/2019-C1/S02.

This paper proposes a nonmonotone spectral gradient method for solving large-scale unconstrained optimization problems. The spectral parameter is derived from the eigenvalues of an optimally sized memoryless symmetric rank-one matrix, where the optimal size is determined under a measure defined as the ratio of the determinant of the updating matrix to its largest eigenvalue. The method is coupled with a nonmonotone line search strategy in which a backtracking-type line search is applied only selectively: the spectral parameter acts as the stepsize in iterations where no line search is performed, and as a milder form of quasi-Newton update in iterations where backtracking is employed. Convergence properties of the proposed method are established for uniformly convex functions. Extensive numerical experiments indicate that the proposed spectral gradient method outperforms some standard conjugate gradient methods.
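To make the framework described above concrete, the following is a minimal Python sketch of a generic nonmonotone spectral gradient iteration. It is not the authors' algorithm: the spectral parameter below is the classical safeguarded Barzilai-Borwein stepsize rather than the memoryless SR1-based parameter proposed in the paper, the line search is a Grippo-Lampariello-Lucidi-type nonmonotone backtracking checked at every iteration rather than applied selectively, and the function name and its parameters (memory, sigma, rho) are illustrative assumptions, not taken from the article.

import numpy as np


def nonmonotone_spectral_gradient(f, grad, x0, max_iter=1000, tol=1e-6,
                                  memory=10, sigma=1e-4, rho=0.5):
    """Generic nonmonotone spectral gradient method (illustrative sketch only).

    Uses the safeguarded Barzilai-Borwein stepsize as the spectral parameter
    and a Grippo-Lampariello-Lucidi nonmonotone Armijo-type condition; the
    memoryless-SR1-based spectral parameter of the paper is NOT reproduced.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    f_hist = [f(x)]                    # recent function values for the nonmonotone test
    alpha = 1.0                        # initial spectral stepsize
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        d = -alpha * g                 # spectral gradient direction
        f_ref = max(f_hist[-memory:])  # nonmonotone reference value
        t = 1.0
        # backtrack until the nonmonotone Armijo-type condition holds (with a lower cap on t)
        while f(x + t * d) > f_ref + sigma * t * g.dot(d) and t > 1e-12:
            t *= rho
        x_new = x + t * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        sy = s.dot(y)
        # safeguarded Barzilai-Borwein spectral parameter for the next iteration
        alpha = s.dot(s) / sy if sy > 1e-12 else 1.0
        alpha = min(max(alpha, 1e-10), 1e10)
        x, g = x_new, g_new
        f_hist.append(f(x))
    return x


if __name__ == "__main__":
    # usage example on an ill-conditioned convex quadratic
    A = np.diag([1.0, 10.0, 100.0])
    f = lambda x: 0.5 * x @ (A @ x)
    grad = lambda x: A @ x
    print(nonmonotone_spectral_gradient(f, grad, np.ones(3)))

In a scheme of this kind the unit step t = 1 typically satisfies the nonmonotone condition on most iterations, so the spectral parameter alpha effectively acts as the stepsize, and backtracking is triggered only when the trial point pushes the objective above the recent maximum; this loosely mirrors the selective use of line search described in the abstract.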

Citation: Hong Seng Sim, Chuei Yee Chen, Wah June Leong, Jiao Li. Nonmonotone spectral gradient method based on memoryless symmetric rank-one update for large-scale unconstrained optimization. Journal of Industrial & Management Optimization, doi: 10.3934/jimo.2021143

Figure 1. SG-mSR1 vs BB in terms of function values.
Figure 2. SG-mSR1 vs BB in terms of gradient norm.
Figure 3. Performance profile comparing the efficiency of SG-mSR1 methods and the standard CG methods in terms of number of iterations.
Figure 4. Performance profile comparing the efficiency of SG-mSR1 methods and the standard CG methods in terms of number of function calls.
Figure 5. Performance profile comparing the efficiency of SG-mSR1 methods and the standard CG methods in terms of CPU time.

