90. Scaled gradient methods under convexity and local smoothness
Invited abstract in session TD-5: Nonsmooth optimization algorithms, stream Nonsmooth and nonconvex optimization algorithms.
Thursday, 14:10 - 15:50, Room: M:N
Authors (first author is the speaker)

| # | Author | Affiliation |
|---|--------|-------------|
| 1 | Susan Ghaderi | ESAT, KU Leuven |
| 2 | Yves Moreau | KU Leuven |
| 3 | Masoud Ahookhosh | Department of Mathematics, University of Antwerp |
Abstract
This paper studies the unconstrained minimization of smooth convex functions whose gradients are locally Lipschitz continuous, focusing on the scaled gradient algorithm (SGA) and its adaptive variant, AdaSGA. We analyze the convergence and complexity of both methods under this local smoothness assumption and assess their practical performance through numerical experiments. Results on Poisson linear inverse problems, in particular, illustrate the potential of these methods in optimization and computational applications.
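The abstract does not spell out the update rule. As a rough illustration only, the sketch below implements a generic scaled gradient step with backtracking on a local Lipschitz estimate, which is a standard way to handle gradients that are only locally Lipschitz. The scaling matrix `D`, the function `scaled_gradient_method`, and all parameters are illustrative assumptions, not the authors' SGA or AdaSGA.

```python
import numpy as np

def scaled_gradient_method(f, grad, x0, D, L0=1.0, max_iter=500, tol=1e-8):
    """Hypothetical sketch of a scaled gradient iteration.

    Takes steps x_{k+1} = x_k - (1/L_k) * D @ grad(x_k), where D is a
    fixed positive-definite scaling matrix and L_k is a local Lipschitz
    estimate adjusted by backtracking. This is NOT the paper's SGA or
    AdaSGA; it only illustrates the general template the abstract
    refers to.
    """
    x, L = x0.astype(float), L0
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        # Increase L until a sufficient-decrease condition (the local
        # descent lemma) certifies the candidate step.
        while True:
            x_new = x - (1.0 / L) * (D @ g)
            d = x_new - x
            if f(x_new) <= f(x) + g @ d + 0.5 * L * (d @ d):
                break
            L *= 2.0
        x = x_new
        L = max(L / 2.0, 1e-12)  # let the estimate shrink again
    return x

# Toy usage: minimize the convex quadratic 0.5 * ||A x - b||^2.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = rng.standard_normal(20)
f = lambda x: 0.5 * np.sum((A @ x - b) ** 2)
grad = lambda x: A.T @ (A @ x - b)
x_star = scaled_gradient_method(f, grad, np.zeros(5), np.eye(5))
```

With `D = np.eye(5)` this reduces to plain gradient descent with backtracking; a nontrivial positive-definite `D` rescales the step per coordinate, which is the point of scaled gradient schemes.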
Keywords
- Convex and non-smooth optimization
Status: accepted