
# Fast convex optimization via inertial dynamics combining viscous and Hessian-driven damping with time rescaling

In a Hilbert setting, we develop fast methods for convex unconstrained optimization. We rely on the asymptotic behavior of an inertial system combining geometric damping with temporal scaling. The convex function to minimize enters the dynamic via its gradient. The dynamic includes three coefficients varying with time: a viscous damping coefficient, a coefficient attached to the Hessian-driven damping, and a time scaling coefficient. We study the convergence rate of the values under general conditions involving the damping and the time scale coefficients. The obtained results are based on a new Lyapunov analysis, and they encompass known results on the subject. We pay particular attention to the case of an asymptotically vanishing viscous damping, which is directly related to the accelerated gradient method of Nesterov. The Hessian-driven damping significantly reduces the oscillatory aspects. We obtain an exponential rate of convergence of values without assuming the strong convexity of the objective function. The temporal discretization of these dynamics opens the gate to a large class of inertial optimization algorithms.
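To illustrate the kind of algorithm obtained by discretizing such dynamics, here is a minimal sketch of an inertial gradient scheme with a Hessian-driven correction. It assumes a dynamic of the generic form $\ddot{x}(t) + \frac{\alpha}{t}\dot{x}(t) + \beta\,\nabla^2 f(x(t))\dot{x}(t) + \nabla f(x(t)) = 0$, approximates the Hessian term by the gradient difference $\nabla f(x_k) - \nabla f(x_{k-1})$, and sets the time scaling coefficient to a constant. The function name, the momentum formula $k/(k+\alpha)$, and all parameter values are illustrative choices, not the paper's exact scheme or conditions.

```python
def inertial_hessian_gradient(grad, x0, alpha=3.0, beta=1.0, h=0.1, n_iter=2000):
    """Sketch of an explicit discretization of an inertial dynamic with
    viscous damping alpha/t and Hessian-driven damping beta (illustrative,
    not the paper's exact algorithm)."""
    x_prev = list(x0)
    x = list(x0)
    g_prev = grad(x_prev)
    for k in range(1, n_iter + 1):
        g = grad(x)
        m = k / (k + alpha)  # discrete analogue of the 1 - alpha/t momentum factor
        x_next = [
            xi + m * (xi - xpi)          # inertial (momentum) term
            - beta * h * (gi - gpi)      # gradient difference, surrogate for Hessian damping
            - h * h * gi                 # gradient step with step size h^2
            for xi, xpi, gi, gpi in zip(x, x_prev, g, g_prev)
        ]
        x_prev, x, g_prev = x, x_next, g
    return x


# Example: minimize f(x) = 0.5 * ||x - (1, -2)||^2, whose gradient is x - (1, -2).
xmin = inertial_hessian_gradient(lambda p: [p[0] - 1.0, p[1] + 2.0], [0.0, 0.0])
```

The gradient-difference surrogate avoids evaluating the Hessian itself, so each iteration costs only one gradient evaluation, while retaining the damping effect on oscillations that the abstract describes.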

Mathematics Subject Classification: 37N40, 46N10, 49M15, 65K05, 65K10, 90C25, 93D05.

Figure 1.  $t \mapsto f(x(t)) - \min f$ for solutions of (51), (52), $f(x_1,x_2) = \frac12\left(x_1^2+x_2^2\right)-\ln(x_1x_2)$

Figure 2.  Convergence rate of $f(x(t))-\min f$ for instances of Theorem 2.1 and general $f$

Figure 3.  Evolution of $f (x(t)) - \min f$ for systems in Figure 2, and $f(x_1,x_2) = \frac12\left(x_1^2+10^3x_2^2\right)$
