
# Fast convex optimization via inertial dynamics combining viscous and Hessian-driven damping with time rescaling

• In a Hilbert space setting, we develop fast methods for unconstrained convex optimization. We rely on the asymptotic behavior of an inertial system that combines geometric (Hessian-driven) damping with temporal scaling. The convex function to be minimized enters the dynamics via its gradient. The dynamics involves three time-dependent coefficients: a viscous damping coefficient, a coefficient attached to the Hessian-driven damping, and a time-scaling coefficient. We study the convergence rate of the values under general conditions linking the damping and time-scaling coefficients. The results, which are based on a new Lyapunov analysis, encompass known results on the subject. We pay particular attention to the case of an asymptotically vanishing viscous damping, which is directly related to Nesterov's accelerated gradient method. The Hessian-driven damping significantly reduces the oscillatory behavior. We obtain an exponential rate of convergence of the values without assuming strong convexity of the objective function. The temporal discretization of these dynamics opens the gate to a large class of inertial optimization algorithms.
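The three time-dependent coefficients described above can be made explicit in the governing differential equation. As a hedged sketch (the notation $\gamma$, $\beta$, $b$ for the three coefficients is our choice of symbols, not necessarily that of the paper), the inertial system has the form

$$
\ddot{x}(t) \;+\; \gamma(t)\,\dot{x}(t) \;+\; \beta(t)\,\nabla^2 f\big(x(t)\big)\,\dot{x}(t) \;+\; b(t)\,\nabla f\big(x(t)\big) \;=\; 0,
$$

where $\gamma(t)$ is the viscous damping coefficient, $\beta(t)$ the Hessian-driven damping coefficient, and $b(t)$ the time-scaling coefficient multiplying the gradient. The case $\gamma(t) = \alpha/t$, $\beta(t) \equiv \beta$, $b(t) \equiv 1$ recovers the asymptotically vanishing viscous damping associated with Nesterov's accelerated gradient method.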

Mathematics Subject Classification: 37N40, 46N10, 49M15, 65K05, 65K10, 90C25, 93D05.


Figure 1.  $t \mapsto f(x(t)) - \min f$ for solutions of (51), (52), with $f(x_1,x_2) = \frac12\left(x_1^2+x_2^2\right)-\ln(x_1x_2)$.

Figure 2.  Convergence rate of $f(x(t))-\min f$ for instances of Theorem 2.1 and general $f$.

Figure 3.  Evolution of $f(x(t)) - \min f$ for the systems in Figure 2, with $f(x_1,x_2) = \frac12\left(x_1^2+10^3x_2^2\right)$.
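The behavior reported in the figures can be reproduced numerically. Below is a minimal sketch, assuming the system $\ddot{x} + (\alpha/t)\dot{x} + \beta\,\nabla^2 f(x)\dot{x} + \nabla f(x) = 0$ with the ill-conditioned quadratic from Figure 3; the parameter values ($\alpha = 3$, $\beta = 1$, the time grid, and the initial point) are illustrative choices, not those of the paper.

```python
import numpy as np

# Semi-implicit Euler integration of the inertial system
#   x'' + (alpha/t) x' + beta * Hess f(x) x' + grad f(x) = 0
# for the quadratic f(x1, x2) = 0.5 * (x1^2 + 1e3 * x2^2),
# the ill-conditioned example from Figure 3.

H = np.diag([1.0, 1.0e3])          # constant Hessian of the quadratic
grad = lambda x: H @ x
f = lambda x: 0.5 * x @ H @ x      # min f = 0, attained at the origin

alpha, beta = 3.0, 1.0             # viscous / Hessian-driven damping (illustrative)
dt, t = 1.0e-4, 1.0                # step size and initial time
x = np.array([1.0, 1.0])           # initial position
v = np.zeros(2)                    # initial velocity

for _ in range(200_000):           # integrate up to t ~ 21
    a = -(alpha / t) * v - beta * (H @ v) - grad(x)
    v += dt * a                    # update velocity first (symplectic Euler)
    x += dt * v
    t += dt

print(f"f(x(t)) - min f = {f(x):.3e}")  # decays toward 0 as t grows
```

The Hessian-driven term `beta * (H @ v)` damps the stiff coordinate ($10^3$ curvature) very strongly, which is exactly the reduction of oscillations that the abstract describes; setting `beta = 0` in this sketch makes the stiff coordinate oscillate visibly.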

