EURO 2024 Copenhagen

1718. A globally convergent gradient method with momentum

Invited abstract in session MB-32: Algorithmic Advances in Large Scale Nonconvex Optimization, stream Advances in large scale nonlinear optimization.

Monday, 10:30-12:00
Room: 41 (building: 303A)

Authors (first author is the speaker)

1. Matteo Lapucci
Department of Information Engineering, University of Florence
2. Giampaolo Liuzzi
DIAG, Sapienza Univ. of Rome
3. Stefano Lucidi
Department of Computer, Control, and Management Science, University of Rome "La Sapienza"

Abstract

Among the many effective strategies to speed up first-order solvers for smooth nonlinear optimization problems, Polyak's momentum (or heavy-ball) method certainly stands out. Polyak's idea consists of exploiting information about past iterates, carrying out a step along a direction that is a linear combination of a suitable gradient-related descent direction and the previous step. The rationale of this update rule is that, by partly following the direction of the last iteration, oscillations can be damped and acceleration can be obtained in low-curvature regions.
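For reference, the classical heavy-ball update described above can be sketched in Python as follows; the fixed coefficients alpha and beta, the stopping rule, and the quadratic test problem are hypothetical illustration choices, not the parameter selection studied in this work.

import numpy as np

def heavy_ball(grad, x0, alpha=0.01, beta=0.9, max_iter=1000, tol=1e-8):
    # Classical Polyak heavy-ball iteration with fixed coefficients:
    #   x_{k+1} = x_k - alpha * grad(x_k) + beta * (x_k - x_{k-1})
    x_prev = x0.copy()
    x = x0.copy()
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) <= tol:
            break
        x_next = x - alpha * g + beta * (x - x_prev)
        x_prev, x = x, x_next
    return x

# Example (hypothetical): minimize the quadratic f(x) = 0.5 * x^T A x
A = np.diag([1.0, 10.0])
x_min = heavy_ball(lambda x: A @ x, np.array([5.0, 5.0]))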
However, except for the quadratic case, the selection of the two parameters defining the update is not trivial; to this aim, we develop an approach where these coefficients are simultaneously determined by a simple inexact bidimensional search. This strategy allows us to define a class of gradient methods with momentum enjoying, under reasonably weak assumptions, global convergence guarantees and an optimal worst-case complexity bound in the nonconvex setting. To the best of our knowledge, these results are novel in the literature. Moreover, extensive computational experiments show that the gradient methods with momentum presented here outperform conjugate gradient methods and are (at least) competitive with the state-of-the-art method for unconstrained optimization, i.e., the L-BFGS method.
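The abstract does not detail the bidimensional search itself; purely as a hypothetical illustration of the general idea, one iteration in which the step size and momentum coefficient are chosen jointly by a coarse inexact search could look as follows (the candidate grids, the Armijo-like acceptance test, and all names are assumptions, not the authors' actual procedure).

import itertools
import numpy as np

def momentum_step_2d_search(f, grad, x, x_prev,
                            alphas=(1.0, 0.5, 0.25, 0.125),
                            betas=(0.9, 0.5, 0.1, 0.0),
                            c=1e-4):
    # Choose (alpha, beta) jointly by an inexact search over a small
    # grid, accepting the first pair that gives sufficient decrease.
    g = grad(x)
    d_mom = x - x_prev  # previous step, used as the momentum direction
    fx = f(x)
    for alpha, beta in itertools.product(alphas, betas):
        x_trial = x - alpha * g + beta * d_mom
        # Armijo-like test on the gradient component of the step
        if f(x_trial) <= fx - c * alpha * np.dot(g, g):
            return x_trial
    # Fall back to a small plain gradient step if no pair is accepted
    return x - alphas[-1] * g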



Status: accepted

