EURO 2024 Copenhagen
Abstract Submission


2532. Adaptive linesearch-free proximal gradient methods under Hölder gradient continuity

Invited abstract in session WA-41: Convex optimization algorithms, stream Nonsmooth Optimization.

Wednesday, 8:30-10:00
Room: 97 (building: 306)

Authors (first author is the speaker)

1. Puya Latafat
Dept. Electrical Engineering (ESAT), STADIUS Center for Dynamical Systems, Signal Processing and Data Analytics, KU Leuven
2. Konstantinos Oikonomidis
Dept. Electrical Engineering (ESAT), KU Leuven
3. Emanuel Laude
KU Leuven
4. Andreas Themelis
Information Science and Electrical Engineering, Kyushu University
5. Panagiotis Patrinos
Electrical Engineering, KU Leuven

Abstract

This work studies a class of adaptive methods for structured convex optimization that use large stepsizes, on par with linesearch methods, without employing any backtracking procedure. We show that the convergence of such methods extends beyond the Lipschitzian setting and encompasses the case of local Hölder gradient continuity. Unlike conventional approaches, which often resort to epsilon-oracles or linesearch procedures to cope with the absence of local Lipschitz continuity, we leverage plain Hölder inequalities without approximation or linesearch. Our analysis establishes exact convergence results without prior knowledge of the local Hölder constants or the order of Hölder continuity.
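To illustrate the general idea of a linesearch-free adaptive proximal gradient iteration, the following sketch uses a stepsize driven purely by a local curvature estimate, with no backtracking and no global Lipschitz constant. This is a generic Malitsky-Mishchenko-type rule written for illustration only; the function names, the growth factor, and the stepsize cap are assumptions of this sketch, not the rule analyzed in the abstract.

```python
import numpy as np

def soft_threshold(v, tau):
    """Proximal operator of tau * ||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def adaptive_prox_grad(grad_f, prox_g, x0, gamma0=1e-3, iters=500):
    """Linesearch-free proximal gradient with an adaptive stepsize.

    Illustrative sketch: the stepsize is capped by an inverse local
    curvature estimate ||x_k - x_{k-1}|| / (2 ||grad_f(x_k) - grad_f(x_{k-1})||)
    and otherwise allowed to grow, so no backtracking and no global
    Lipschitz constant are needed.  Not the abstract's exact method.
    """
    x_prev = np.asarray(x0, dtype=float)
    g_prev = grad_f(x_prev)
    gamma_prev, theta = gamma0, 1.0
    x = prox_g(x_prev - gamma_prev * g_prev, gamma_prev)
    for _ in range(iters):
        g = grad_f(x)
        dx = np.linalg.norm(x - x_prev)
        dg = np.linalg.norm(g - g_prev)
        if dx == 0.0:  # iterates have stalled: fixed point reached
            break
        # cap by inverse local curvature; allow controlled growth otherwise
        cap = dx / (2.0 * dg) if dg > 0.0 else np.inf
        gamma = min(np.sqrt(1.0 + theta) * gamma_prev, cap)
        theta = gamma / gamma_prev
        x_prev, g_prev, gamma_prev = x, g, gamma
        x = prox_g(x - gamma * g, gamma)
    return x
```

For example, on a toy lasso problem min_x 0.5||x - b||^2 + lam*||x||_1, calling `adaptive_prox_grad` with `grad_f = lambda x: x - b` and `prox_g = lambda v, gamma: soft_threshold(v, gamma * lam)` recovers the closed-form solution `soft_threshold(b, lam)` without ever being told a Lipschitz constant.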

Status: accepted

