Global convergence of a memory gradient method with closed-form step size formula
The memory gradient method is used to solve large-scale unconstrained optimization problems. We investigate a closed-form step size formula, given by a finite number of iterates of Weiszfeld's algorithm, to compute the step size for a memory gradient method. This formula can be classified as a no-line-search procedure since, unlike classical line search procedures, no stopping criterion is involved to ensure convergence. We show the global convergence of the memory gradient method under weaker conditions.
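To illustrate the general scheme the abstract refers to, here is a minimal sketch of a memory gradient iteration, where the search direction combines the current negative gradient with the previous direction. The fixed step size and memory parameter `beta` below are hypothetical placeholders, not the paper's Weiszfeld-based closed-form step size.

```python
import numpy as np

def memory_gradient(f_grad, x0, beta=0.5, step=1e-2, max_iter=1000, tol=1e-8):
    """Minimal memory gradient sketch (illustrative, not the paper's method).

    Direction update: d_k = -g_k + beta * d_{k-1},
    i.e. the new direction "remembers" the previous one.
    `step` stands in for the closed-form step size studied in the paper.
    """
    x = np.asarray(x0, dtype=float)
    g = f_grad(x)
    d = -g  # first step is plain steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        x = x + step * d      # move along the memory direction
        g = f_grad(x)
        d = -g + beta * d     # blend new gradient with previous direction
    return x

# Example: minimize f(x) = ||x||^2, whose gradient is 2x.
x_star = memory_gradient(lambda x: 2 * x, np.array([1.0, 1.0]))
```

For this strongly convex quadratic, the iterates contract toward the origin, so `x_star` ends up close to the minimizer `[0, 0]`.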