2532. Adaptive linesearch-free proximal gradient methods under Hölder gradient continuity
Invited abstract in session WA-41: Convex optimization algorithms, stream Nonsmooth Optimization.
Wednesday, 8:30-10:00, Room: 97 (building: 306)
Authors (first author is the speaker)
1. Puya Latafat, Department of Electrical Engineering (ESAT), STADIUS Center for Dynamical Systems, Signal Processing and Data Analytics, KU Leuven
2. Konstantinos Oikonomidis, Department of Electrical Engineering (ESAT), KU Leuven
3. Emanuel Laude, KU Leuven
4. Andreas Themelis, Information Science and Electrical Engineering, Kyushu University
5. Panagiotis Patrinos, Department of Electrical Engineering, KU Leuven
Abstract
This work studies a class of adaptive methods for structured convex optimization that take large stepsizes, on par with linesearch methods, without employing any backtracking procedure. We show that the convergence of such methods extends beyond the Lipschitzian setting and encompasses the case of local Hölder gradient continuity. Unlike conventional approaches, which often resort to epsilon-oracles or linesearch procedures to cope with the absence of local Lipschitz continuity, we leverage plain Hölder inequalities without approximation or linesearch. Our analysis establishes exact convergence results without requiring prior knowledge of the local Hölder constants or the order of Hölder continuity.
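To make the idea concrete, here is a minimal sketch of a linesearch-free adaptive proximal gradient iteration in which the stepsize is estimated online from observed gradient differences rather than supplied as a smoothness constant. The particular estimate below follows the well-known Malitsky-Mishchenko style rule and is used purely as an illustration of backtracking-free adaptivity; it is not the specific update analyzed in the talk, and the `prox_l1` test problem is an assumed example.

```python
import numpy as np

def prox_l1(x, t):
    """Proximal operator of t * ||x||_1 (soft-thresholding)."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def adaptive_prox_grad(grad_f, prox_g, x0, lam0=1e-6, iters=500):
    """Proximal gradient with adaptive stepsizes and no backtracking.

    The stepsize is built from a local inverse-Lipschitz estimate
    ||x_k - x_{k-1}|| / (2 ||grad_k - grad_{k-1}||), so no global
    smoothness (or Hölder) constant is ever supplied.  Illustrative
    sketch only, not the authors' update rule.
    """
    x_prev = np.asarray(x0, dtype=float)
    g_prev = grad_f(x_prev)
    lam_prev = lam0
    theta = np.inf                      # ratio of consecutive stepsizes
    x = prox_g(x_prev - lam_prev * g_prev, lam_prev)
    for _ in range(iters):
        g = grad_f(x)
        dx = np.linalg.norm(x - x_prev)
        dg = np.linalg.norm(g - g_prev)
        # local curvature estimate; guard against a zero denominator
        local = dx / (2.0 * dg) if dg > 0 else np.inf
        lam = min(np.sqrt(1.0 + theta) * lam_prev, local)
        theta = lam / lam_prev
        x_prev, g_prev, lam_prev = x, g, lam
        x = prox_g(x - lam * g, lam)    # forward-backward step
    return x

# Assumed usage example: LASSO, min 0.5*||A x - b||^2 + mu*||x||_1
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100))
b = rng.standard_normal(40)
mu = 0.1
grad = lambda x: A.T @ (A @ x - b)
prox = lambda v, t: prox_l1(v, mu * t)
x_star = adaptive_prox_grad(grad, prox, np.zeros(100))
```

The point of the sketch is the stepsize line: it grows geometrically when the iterates suggest the gradient is tame and contracts when the observed gradient variation is large, which is what lets such schemes match linesearch-sized steps without evaluating the objective repeatedly.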
Keywords
- Continuous Optimization
- Non-smooth Optimization
- Convex Optimization
Status: accepted