
Memoryless quasi-Newton methods based on spectral-scaling Broyden family for unconstrained optimization

  • Corresponding author: Shummin Nakayama
  • Memoryless quasi-Newton methods are studied for solving large-scale unconstrained optimization problems. Recently, memoryless quasi-Newton methods based on several kinds of updating formulas have been proposed. Since these methods are closely related to the conjugate gradient method, they are promising. In this paper, we propose a memoryless quasi-Newton method based on the Broyden family with the spectral-scaling secant condition. We focus on the convex and preconvex classes of the Broyden family, and we show that the proposed method satisfies the sufficient descent condition and converges globally. Finally, some numerical experiments are reported.
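    In a memoryless quasi-Newton method, the previous Hessian approximation is replaced by a (scaled) identity, so a Broyden-family search direction can be formed from a few inner products without storing any matrix. The sketch below is a minimal illustration only, not the paper's exact method: it writes the Broyden family as the combination θ·(memoryless BFGS) + (1−θ)·(memoryless DFP) applied to γI, and the choice of γ (here left to the caller) stands in for the spectral-scaling secant condition proposed in the paper.

    ```python
    import numpy as np

    def memoryless_broyden_direction(g, s, y, theta, gamma):
        """Search direction d = -H g, where H is a memoryless Broyden-family
        update of gamma*I, written as theta*H_BFGS + (1 - theta)*H_DFP
        (theta = 1: BFGS; theta = 0: DFP; theta > 1: preconvex class)."""
        sy = s @ y          # curvature s^T y (assumed positive)
        sg = s @ g
        yg = y @ g
        yy = y @ y
        # Memoryless DFP applied to gamma*I:
        #   H_DFP g = gamma*g - gamma*(y^T g / y^T y) y + (s^T g / s^T y) s
        Hg_dfp = gamma * g - gamma * (yg / yy) * y + (sg / sy) * s
        # Memoryless BFGS applied to gamma*I:
        #   H_BFGS g = gamma*g - gamma*((y^T g) s + (s^T g) y)/s^T y
        #              + (1 + gamma*y^T y/s^T y)*(s^T g/s^T y) s
        Hg_bfgs = (gamma * g - gamma * (yg * s + sg * y) / sy
                   + (1.0 + gamma * yy / sy) * (sg / sy) * s)
        return -(theta * Hg_bfgs + (1.0 - theta) * Hg_dfp)
    ```

    When s_{k-1}^T y_{k-1} > 0, γ > 0 and 0 ≤ θ ≤ 1 (the convex class), H is positive definite, so d is a descent direction.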

    Mathematics Subject Classification: Primary: 90C30, 90C06; Secondary: 65K05.

  • Figure 1.  Performance profiles of methods in Table 2

    Figure 2.  Performance profiles of methods in Table 3

    Figure 3.  Performance profiles of methods in Table 4

    Figure 4.  Performance profiles of methods in Table 5
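    Figures 1-4 report Dolan-Moré performance profiles [7]. As a rough illustration of how such a profile is computed (a minimal sketch, not the authors' evaluation code; the cost matrix, the use of `np.inf` for failures, and the function name are assumptions):

    ```python
    import numpy as np

    def performance_profile(costs, taus):
        """Dolan-More performance profile.

        costs[i, j] = cost (e.g. CPU time or iteration count) of solver j
        on problem i; np.inf marks a failure.  Returns rho[t, j], the
        fraction of problems on which solver j is within a factor taus[t]
        of the best solver for that problem."""
        best = costs.min(axis=1, keepdims=True)   # best cost per problem
        ratios = costs / best                     # performance ratios
        taus = np.asarray(taus)[:, None, None]
        return (ratios[None, :, :] <= taus).mean(axis=1)
    ```

    Plotting each column of the result against tau gives one curve per solver; higher curves indicate more robust and efficient solvers.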

    Table 1.  Test problems (names and dimensions) from the CUTEr library

    name n name n name n name n
    AKIVA 2 DIXMAANC 3000 HEART8LS 8 PENALTY1 1000
    ALLINITU 4 DIXMAAND 3000 HELIX 3 PENALTY2 200
    ARGLINA 200 DIXMAANE 3000 HIELOW 3 PENALTY3 200
    ARGLINB 200 DIXMAANF 3000 HILBERTA 2 POWELLSG 5000
    ARWHEAD 5000 DIXMAANG 3000 HILBERTB 10 POWER 10000
    BARD 3 DIXMAANH 3000 HIMMELBB 2 QUARTC 5000
    BDQRTIC 5000 DIXMAANI 3000 HIMMELBF 4 ROSENBR 2
    BEALE 2 DIXMAANJ 3000 HIMMELBG 2 S308 2
    BIGGS6 6 DIXMAANK 3000 HIMMELBH 2 SCHMVETT 5000
    BOX3 3 DIXMAANL 3000 HUMPS 2 SENSORS 100
    BOX 10000 DIXON3DQ 10000 JENSMP 2 SINEVAL 2
    BRKMCC 2 DJTL 2 KOWOSB 4 SINQUAD 5000
    BROWNAL 200 DQDRTIC 5000 LIARWHD 5000 SISSER 2
    BROWNBS 2 DQRTIC 5000 LOGHAIRY 2 SNAIL 2
    BROWNDEN 4 EDENSCH 2000 MANCINO 100 SPARSINE 5000
    BROYDN7D 5000 EG2 1000 MARATOSB 2 SPARSQUR 10000
    BRYBND 5000 ENGVAL1 5000 MEXHAT 2 SPMSRTLS 4999
    CHAINWOO 4000 ENGVAL2 3 MOREBV 5000 SROSENBR 5000
    CHNROSNB 50 ERRINROS 50 MSQRTALS 1024 STRATEC 10
    CLIFF 2 EXPFIT 2 MSQRTBLS 1024 TESTQUAD 5000
    COSINE 10000 EXTROSNB 1000 NONCVXU2 5000 TOINTGOR 50
    CRAGGLVY 5000 FLETCBV2 5000 NONDIA 5000 TOINTGSS 5000
    CUBE 2 FLETCHCR 1000 NONDQUAR 5000 TOINTPSP 50
    CURLY10 10000 FMINSRF2 5625 OSBORNEA 5 TOINTQOR 50
    CURLY20 10000 FMINSURF 5625 OSBORNEB 11 TQUARTIC 5000
    CURLY30 10000 FREUROTH 5000 OSCIPATH 10 TRIDIA 5000
    DECONVU 63 GENHUMPS 5000 PALMER1C 8 VARDIM 200
    DENSCHNA 2 GENROSE 500 PALMER1D 7 VAREIGVL 50
    DENSCHNB 2 GROWTHLS 3 PALMER2C 8 VIBRBEAM 8
    DENSCHNC 2 GULF 3 PALMER3C 8 WATSON 12
    DENSCHND 3 HAIRY 2 PALMER4C 8 WOODS 4000
    DENSCHNE 3 HATFLDD 3 PALMER5C 6 YFITU 3
    DENSCHNF 2 HATFLDE 3 PALMER6C 8 ZANGWIL2 2
    DIXMAANA 3000 HATFLDFL 3 PALMER7C 8
    DIXMAANB 3000 HEART6LS 6 PALMER8C 8

    Table 2.  Tested methods (Methods 1-9)

    Method number $ \theta_{k-1}$ Class Global convergence
    1 (DFP) 0 convex not established
    2 0.25 convex established
    3 0.5 convex established
    4 0.75 convex established
    5 (BFGS) 1 convex established
    6 1.25 preconvex established
    7 1.5 preconvex established
    8 1.75 preconvex established
    9 2 preconvex not established

    Table 3.  Tested methods (Methods 5 and 10-17)

    Method number $ \theta_{k-1}$ Class Global convergence
    5 (BFGS) 1 convex established
    10 0.8 convex established
    11 0.85 convex established
    12 0.9 convex established
    13 0.95 convex established
    14 1.05 preconvex established
    15 1.1 preconvex established
    16 1.15 preconvex established
    17 1.2 preconvex established

    Table 4.  Tested methods (Methods 5 and 18-21)

    Method number $\hat\theta_{k-1}$ Class Global convergence
    5 (BFGS) 1 convex established
    18 (Hoshino) $\dfrac{s_{k-1}^Ty_{k-1}}{(s_{k-1} + \hat\gamma_{k-1}^{-1}y_{k-1})^T y_{k-1}}$ convex established
    19 $1 + \left|\dfrac{d_{k-1}^Tg_k}{d_{k-1}^Tg_{k-1}}\right|$ preconvex established
    20 $1 + \dfrac{ d_{k-1}^Tg_k }{\|d_{k-1}\| \|g_{k}\|}$ unknown established
    21 $1 + \dfrac{ |d_{k-1}^Tg_k| }{\|d_{k-1}\| \|g_{k}\|}$ preconvex established
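    The class column of Table 4 can be checked numerically. The sketch below (an illustration, with hypothetical variable names) computes the parameter choices of Methods 19-21 from d_{k-1}, g_{k-1} and g_k: by the Cauchy-Schwarz inequality, Method 19 always gives θ̂ ≥ 1 and Method 21 gives θ̂ in [1, 2] (preconvex), while Method 20 can fall below 1, which is why its class is listed as unknown.

    ```python
    import numpy as np

    def theta_choices(d_prev, g_prev, g_cur):
        """Parameter choices theta-hat of Methods 19-21 (Table 4)."""
        dg_cur = d_prev @ g_cur
        dg_prev = d_prev @ g_prev
        scale = np.linalg.norm(d_prev) * np.linalg.norm(g_cur)
        theta19 = 1.0 + abs(dg_cur / dg_prev)          # always >= 1
        theta20 = 1.0 + dg_cur / scale                 # may be < 1
        theta21 = 1.0 + abs(dg_cur) / scale            # in [1, 2]
        return theta19, theta20, theta21
    ```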

    Table 5.  Tested methods (Methods 5, 19 and 21-25)

    Method number $\hat\gamma_{k-1}$ $\hat\theta_{k-1}$ Global convergence
    5 (BFGS) (54) 1 established
    19 (54) $1 + \left|\dfrac{d_{k-1}^Tg_k}{d_{k-1}^Tg_{k-1}}\right| $ established
    21 (54) $1 + \dfrac{ |d_{k-1}^Tg_k| }{\|d_{k-1}\| \|g_{k}\|}$ established
    22 (BFGS) (56) 1 not established
    23 (56) $1 + \left|\dfrac{d_{k-1}^Tg_k}{d_{k-1}^Tg_{k-1}}\right|$ not established
    24 (56) $1 + \dfrac{ |d_{k-1}^Tg_k| }{\|d_{k-1}\| \|g_{k}\|}$ not established
    25 (CG_DESCENT) [11,12,14] established
  • [1] M. Al-Baali, Y. Narushima and H. Yabe, A family of three-term conjugate gradient methods with sufficient descent property for unconstrained optimization, Computational Optimization and Applications, 60 (2015), 89-110.  doi: 10.1007/s10589-014-9662-z.
    [2] I. Bongartz, A. R. Conn, N. I. M. Gould and P. L. Toint, CUTE: constrained and unconstrained testing environment, ACM Transactions on Mathematical Software, 21 (1995), 123-160.  doi: 10.1145/200979.201043.
    [3] Z. Chen and W. Cheng, Spectral-scaling quasi-Newton methods with updates from the one parameter of the Broyden family, Journal of Computational and Applied Mathematics, 248 (2013), 88-98.  doi: 10.1016/j.cam.2013.01.012.
    [4] W. Y. Cheng and D. H. Li, Spectral scaling BFGS method, Journal of Optimization Theory and Applications, 146 (2010), 305-319.  doi: 10.1007/s10957-010-9652-y.
    [5] Y. H. Dai and C. X. Kou, A nonlinear conjugate gradient algorithm with an optimal property and an improved Wolfe line search, SIAM Journal on Optimization, 23 (2013), 296-320.  doi: 10.1137/100813026.
    [6] Y. H. Dai and L. Z. Liao, New conjugacy conditions and related nonlinear conjugate gradient methods, Applied Mathematics and Optimization, 43 (2001), 87-101.  doi: 10.1007/s002450010019.
    [7] E. D. Dolan and J. J. Moré, Benchmarking optimization software with performance profiles, Mathematical Programming, 91 (2002), 201-213.  doi: 10.1007/s101070100263.
    [8] J. A. Ford and I. A. Moghrabi, Multi-step quasi-Newton methods for optimization, Journal of Computational and Applied Mathematics, 50 (1994), 305-323.  doi: 10.1016/0377-0427(94)90309-3.
    [9] J. C. Gilbert and J. Nocedal, Global convergence properties of conjugate gradient methods for optimization, SIAM Journal on Optimization, 2 (1992), 21-42.  doi: 10.1137/0802003.
    [10] N. I. M. Gould, D. Orban and P. L. Toint, CUTEr and SifDec: A constrained and unconstrained testing environment, revisited, ACM Transactions on Mathematical Software, 29 (2003), 373-394.  doi: 10.1145/962437.962439.
    [11] W. W. Hager, Hager's web page: https://people.clas.ufl.edu/hager/.
    [12] W. W. Hager and H. Zhang, A new conjugate gradient method with guaranteed descent and an efficient line search, SIAM Journal on Optimization, 16 (2005), 170-192.  doi: 10.1137/030601880.
    [13] W. W. Hager and H. Zhang, A survey of nonlinear conjugate gradient methods, Pacific Journal of Optimization, 2 (2006), 35-58. 
    [14] W. W. Hager and H. Zhang, Algorithm 851: CG_DESCENT, a conjugate gradient method with guaranteed descent, ACM Transactions on Mathematical Software, 32 (2006), 113-137.  doi: 10.1145/1132973.1132979.
    [15] M. R. Hestenes and E. Stiefel, Methods of conjugate gradients for solving linear systems, Journal of Research of the National Bureau of Standards, 49 (1952), 409-436.  doi: 10.6028/jres.049.044.
    [16] S. Hoshino, A formulation of variable metric methods, IMA Journal of Applied Mathematics, 10 (1972), 394-403.  doi: 10.1093/imamat/10.3.394.
    [17] C. X. Kou and Y. H. Dai, A modified self-scaling memoryless Broyden-Fletcher-Goldfarb-Shanno method for unconstrained optimization, Journal of Optimization Theory and Applications, 165 (2015), 209-224.  doi: 10.1007/s10957-014-0528-4.
    [18] D. H. Li and M. Fukushima, A modified BFGS method and its global convergence in nonconvex minimization, Journal of Computational and Applied Mathematics, 129 (2001), 15-35.  doi: 10.1016/S0377-0427(00)00540-9.
    [19] F. Modarres, M. A. Hassan and W. J. Leong, Memoryless modified symmetric rank-one method for large-scale unconstrained optimization, American Journal of Applied Sciences, 6 (2009), 2054-2059. 
    [20] A. U. Moyi and W. J. Leong, A sufficient descent three-term conjugate gradient method via symmetric rank-one update for large-scale optimization, Optimization, 65 (2016), 121-143.  doi: 10.1080/02331934.2014.994625.
    [21] S. Nakayama, Y. Narushima and H. Yabe, A memoryless symmetric rank-one method with sufficient descent property for unconstrained optimization, Journal of the Operations Research Society of Japan, 61 (2018), 53-70.  doi: 10.15807/jorsj.61.53.
    [22] Y. Narushima and H. Yabe, A survey of sufficient descent conjugate gradient methods for unconstrained optimization, SUT Journal of Mathematics, 50 (2014), 167-203. 
    [23] Y. Narushima, H. Yabe and J. A. Ford, A three-term conjugate gradient method with sufficient descent property for unconstrained optimization, SIAM Journal on Optimization, 21 (2011), 212-230.  doi: 10.1137/080743573.
    [24] J. Nocedal and S. J. Wright, Numerical Optimization, 2nd edition, Springer, 2006.
    [25] S. S. Oren, Self-scaling variable metric (SSVM) algorithms, Part II: Implementation and experiments, Management Science, 20 (1974), 863-874.  doi: 10.1287/mnsc.20.5.863.
    [26] S. S. Oren and D. G. Luenberger, Self-scaling variable metric (SSVM) algorithms, Part I: Criteria and sufficient conditions for scaling a class of algorithms, Management Science, 20 (1974), 845-862.  doi: 10.1287/mnsc.20.5.845.
    [27] D. F. Shanno, Conjugate gradient methods with inexact searches, Mathematics of Operations Research, 3 (1978), 244-256.  doi: 10.1287/moor.3.3.244.
    [28] K. Sugiki, Y. Narushima and H. Yabe, Globally convergent three-term conjugate gradient methods that use secant condition and generate descent search directions for unconstrained optimization, Journal of Optimization Theory and Applications, 153 (2012), 733-757.  doi: 10.1007/s10957-011-9960-x.
    [29] L. Sun, An approach to scaling symmetric rank-one update, Pacific Journal of Optimization, 2 (2006), 105-118. 
    [30] W. Sun and Y. Yuan, Optimization Theory and Methods: Nonlinear Programming, Springer, 2006.
    [31] Z. Wei, G. Li and L. Qi, New quasi-Newton methods for unconstrained optimization problems, Applied Mathematics and Computation, 175 (2006), 1156-1188.  doi: 10.1016/j.amc.2005.08.027.
    [32] J. Z. Zhang, N. Y. Deng and L. H. Chen, New quasi-Newton equation and related methods for unconstrained optimization, Journal of Optimization Theory and Applications, 102 (1999), 147-167.  doi: 10.1023/A:1021898630001.
    [33] L. Zhang, W. Zhou and D. H. Li, A descent modified Polak-Ribière-Polyak conjugate gradient method and its global convergence, IMA Journal of Numerical Analysis, 26 (2006), 629-640.  doi: 10.1093/imanum/drl016.
    [34] Y. Zhang and R. P. Tewarson, Quasi-Newton algorithms with updates from the preconvex part of Broyden's family, IMA Journal of Numerical Analysis, 8 (1988), 487-509.  doi: 10.1093/imanum/8.4.487.
