Convergence analysis of sparse quasi-Newton updates with positive definite matrix completion for two-dimensional functions

Abstract
  • In this paper, we briefly review extensions of quasi-Newton methods for large-scale optimization. In particular, based on the idea of maximum determinant positive definite matrix completion, Yamashita (2008) proposed a new sparse quasi-Newton update, called MCQN, for unconstrained optimization problems with sparse Hessian structures. In exchange for relaxing the secant equation, the MCQN update avoids solving difficult subproblems and overcomes the ill-conditioning of the approximate Hessian matrices. In this paper we give a global convergence analysis for the MCQN update with Broyden's convex family, under the assumptions that the objective function is uniformly convex and that its dimension is only two.
        This paper is dedicated to Professor Masao Fukushima on the occasion of his 60th birthday.
    Mathematics Subject Classification: Primary: 90C53, 90C06.
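The core idea behind the MCQN update, maximum determinant positive definite matrix completion, can be illustrated on a small chordal sparsity pattern. The sketch below is illustrative only and is not the paper's algorithm: the function name and the choice of a 3x3 tridiagonal pattern are our own. For this pattern the single unspecified entry has a closed-form maximum-determinant value, and the completed matrix has the characteristic property that its inverse respects the sparsity pattern.

```python
import numpy as np

def maxdet_completion_tridiag(diag, off):
    """Maximum-determinant positive definite completion of a 3x3 partial
    symmetric matrix with tridiagonal pattern; entry (1,3) is unspecified.

    For this chordal pattern the optimum has the closed form
    a13 = a12 * a23 / a22, which makes the inverse of the completed
    matrix tridiagonal.  (Hypothetical helper for illustration only.)
    """
    a12, a23 = off
    a13 = a12 * a23 / diag[1]  # closed-form maximum-determinant choice
    return np.array([[diag[0], a12,     a13],
                     [a12,     diag[1], a23],
                     [a13,     a23,     diag[2]]])
```

For example, with `diag = [2, 2, 2]` and `off = (1, 1)` the free entry becomes 0.5, the (1,3) entry of the inverse vanishes, and any other symmetric choice of the free entry yields a smaller determinant.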

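Since the convergence result concerns the MCQN update combined with Broyden's convex family, a minimal sketch of one Broyden-family update step may help fix notation. This is the generic textbook formula, not the paper's MCQN procedure, and the function name and parameterization are our own: `phi = 0` gives BFGS and `phi = 1` gives DFP. Under uniform convexity the curvature condition s^T y > 0 holds, so the update preserves positive definiteness.

```python
import numpy as np

def broyden_family_update(B, s, y, phi=0.0):
    """One Broyden convex-family update of the Hessian approximation B.

    s = x_{k+1} - x_k, y = grad f(x_{k+1}) - grad f(x_k); requires the
    curvature condition s^T y > 0.  phi = 0 is BFGS, phi = 1 is DFP.
    (Generic textbook formula for illustration; not the MCQN update.)
    """
    Bs = B @ s
    sBs = s @ Bs
    ys = y @ s
    v = y / ys - Bs / sBs
    return (B - np.outer(Bs, Bs) / sBs
              + np.outer(y, y) / ys
              + phi * sBs * np.outer(v, v))
```

Every member of the family satisfies the secant equation B_{k+1} s = y; the MCQN update deliberately relaxes this equation in order to restore the sparsity of B_{k+1} via matrix completion.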
References

    [1] R. Byrd and J. Nocedal, A tool for the analysis of quasi-Newton methods with application to unconstrained optimization, SIAM Journal on Numerical Analysis, 26 (1989), 727-739. doi: 10.1137/0726042.

    [2] R. Byrd, J. Nocedal and Y. Yuan, Global convergence of a class of quasi-Newton methods on convex problems, SIAM Journal on Numerical Analysis, 24 (1987), 1171-1190. doi: 10.1137/0724077.

    [3] M. H. Cheng and Y. H. Dai, A new sparse quasi-Newton update method, Research report, Academy of Mathematics and Systems Science, Chinese Academy of Sciences, Beijing, 2010.

    [4] Y. H. Dai, Convergence properties of the BFGS algorithm, SIAM Journal on Optimization, 13 (2003), 693-701. doi: 10.1137/S1052623401383455.

    [5] Y. H. Dai, A perfect example for the BFGS method, Research report, Academy of Mathematics and Systems Science, Chinese Academy of Sciences, Beijing, 2010.

    [6] Y. H. Dai and N. Yamashita, Analysis of sparse quasi-Newton updates with positive definite matrix completion, Research report, Academy of Mathematics and Systems Science, Chinese Academy of Sciences, Beijing, 2007.

    [7] J. E. Dennis, Jr. and J. J. Moré, Quasi-Newton methods, motivation and theory, SIAM Review, 19 (1977), 46-89. doi: 10.1137/1019005.

    [8] R. Fletcher, An optimal positive definite update for sparse Hessian matrices, SIAM Journal on Optimization, 5 (1995), 192-218. doi: 10.1137/0805010.

    [9] M. Fukuda, M. Kojima, K. Murota and K. Nakata, Exploiting sparsity in semidefinite programming via matrix completion I: General framework, SIAM Journal on Optimization, 11 (2000), 647-674. doi: 10.1137/S1052623400366218.

    [10] A. Griewank and Ph. L. Toint, On the unconstrained optimization of partially separable objective functions, in "Nonlinear Optimization 1981" (ed. M. J. D. Powell), Academic Press, London, (1982), 301-312.

    [11] D. H. Li and M. Fukushima, On the global convergence of the BFGS method for nonconvex unconstrained optimization problems, SIAM Journal on Optimization, 11 (2001), 1054-1064. doi: 10.1137/S1052623499354242.

    [12] D. H. Li and M. Fukushima, A modified BFGS method and its global convergence in nonconvex minimization, Journal of Computational and Applied Mathematics, 129 (2001), 15-35. doi: 10.1016/S0377-0427(00)00540-9.

    [13] D. C. Liu and J. Nocedal, On the limited memory BFGS method for large scale optimization, Mathematical Programming, 45 (1989), 503-528. doi: 10.1007/BF01589116.

    [14] W. F. Mascarenhas, The BFGS algorithm with exact line searches fails for nonconvex functions, Mathematical Programming, 99 (2004), 49-61. doi: 10.1007/s10107-003-0421-7.

    [15] J. Nocedal, Updating quasi-Newton matrices with limited storage, Mathematics of Computation, 35 (1980), 773-782. doi: 10.1090/S0025-5718-1980-0572855-7.

    [16] M. J. D. Powell, Some global convergence properties of a variable metric algorithm for minimization without exact line searches, in "Nonlinear Programming, SIAM-AMS Proceedings Vol. IX" (eds. R. W. Cottle and C. E. Lemke), SIAM, Philadelphia, PA, (1976), 53-72.

    [17] D. C. Sorensen, Collinear scaling and sequential estimation in sparse optimization algorithms, Mathematical Programming Study, 18 (1982), 135-159.

    [18] Ph. L. Toint, On sparse and symmetric matrix updating subject to a linear equation, Mathematics of Computation, 31 (1977), 954-961. doi: 10.1090/S0025-5718-1977-0455338-4.

    [19] N. Yamashita, Sparse quasi-Newton updates with positive definite matrix completion, Mathematical Programming, 115 (2008), 1-30. doi: 10.1007/s10107-007-0137-1.

    [20] Y. Yuan, "Numerical Methods for Nonlinear Programming," Shanghai Scientific & Technical Publishers, Shanghai, 1993 (in Chinese).
