EUROPT 2024
Abstract Submission

197. Near-optimal Closed-loop Method via Lyapunov Damping for Convex Optimization

Invited abstract in session TD-7: Accelerated methods in modern optimization, stream Optimization for Inverse Problems and Machine Learning.

Thursday, 14:10 - 15:50
Room: M:I

Authors (first author is the speaker)

1. Camille Castera
Mathematical Institute, University of Bordeaux
2. Severin Maier
University of Tübingen
3. Peter Ochs
Saarland University

Abstract

Optimal rates of convergence for first-order convex optimization are achieved via so-called open-loop damping strategies. In particular, Nesterov's algorithm features the intriguing momentum parameter (k-1)/(k+2), which depends on the iteration index k, making the initial iteration index a hyper-parameter that affects the performance of the algorithm.
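
To make the dependence on the counter concrete, the following is a minimal Python sketch of Nesterov's method with the open-loop schedule above; the offset k0 is a name chosen here for illustration and plays the role of the initial iteration index mentioned in the abstract.

    import numpy as np

    def nesterov(grad, x0, step, n_iter=500, k0=1):
        """Nesterov's accelerated gradient with the open-loop momentum (k-1)/(k+2).

        The counter offset k0 (hypothetical name) is the initial iteration
        index discussed above: shifting it changes the whole damping schedule.
        """
        x = np.asarray(x0, dtype=float)
        y = x.copy()
        for k in range(1, n_iter + 1):
            x_next = y - step * grad(y)         # gradient step at the extrapolated point
            beta = (k + k0 - 1) / (k + k0 + 2)  # open-loop momentum schedule
            y = x_next + beta * (x_next - x)    # momentum extrapolation
            x = x_next
        return x

    # Minimize the convex quadratic f(x) = 0.5 * ||x||^2 (its gradient is x).
    print(nesterov(grad=lambda x: x, x0=np.ones(2), step=0.5))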
From a dynamical-systems perspective, we overcome this issue by designing the momentum (or damping) parameter in a closed-loop manner, using a Lyapunov function. This creates a feedback loop between the momentum and the speed of convergence of the system. We show that the resulting method achieves a convergence rate arbitrarily close to the optimal one while also eliminating this additional hyper-parameter.
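
For contrast, here is a schematic sketch of the closed-loop idea under simplifying assumptions (min f = 0, a heavy-ball-style discretization, and a hypothetical feedback law). It illustrates damping driven by a Lyapunov-type energy of the current state rather than by the counter k; it is not the authors' actual scheme.

    import numpy as np

    def lyapunov_damped(grad, f, x0, step, n_iter=500):
        """Schematic closed-loop variant: the damping gamma is a feedback of a
        Lyapunov-type energy of the current state, not of the counter k.

        The feedback law below is a hypothetical illustration of the
        principle, not the method from the paper.
        """
        x = np.asarray(x0, dtype=float)
        v = np.zeros_like(x)                        # velocity of the discretized dynamics
        for _ in range(n_iter):
            energy = f(x) + 0.5 * np.dot(v, v)      # Lyapunov-type energy (assumes min f = 0)
            gamma = min(1.0, np.sqrt(energy))       # damping shrinks as the energy decays
            v = (1.0 - gamma) * v - step * grad(x)  # damped momentum update
            x = x + step * v
        return x

    # Same convex quadratic as above: f(x) = 0.5 * ||x||^2.
    print(lyapunov_damped(grad=lambda x: x, f=lambda x: 0.5 * np.dot(x, x),
                          x0=np.ones(2), step=0.1))

The point of the feedback is that the damping adapts automatically: as the energy decays, gamma shrinks and the effective momentum grows, mimicking the vanishing damping of the open-loop schedule without any counter-based tuning.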

Status: accepted
