239. ACCELERATED BREGMAN DIVERGENCE OPTIMIZATION WITH SMART: AN INFORMATION GEOMETRIC POINT OF VIEW
Invited abstract in session FD-2: Deterministic and stochastic optimization beyond Euclidean geometry, stream Advances in first-order optimization.
Friday, 14:10 - 15:50
Room: M:O
Authors (first author is the speaker)
1. Stefania Petra, Heidelberg University
Abstract
We consider the minimization of the Kullback-Leibler divergence between a linear model Ax and a positive vector b over various convex constraint sets, such as the positive orthant, the n-dimensional box, and the probability simplex. Our focus is on the SMART method, which employs efficient multiplicative updates and is an exponentiated gradient method. This method admits dual interpretations: as a Bregman proximal gradient method and as Riemannian gradient descent on the parameter manifold of a corresponding distribution within the exponential family. This duality allows us to connect the two viewpoints, yielding accelerated SMART iterates while seamlessly incorporating constraints. Furthermore, it enables the development of a multilevel method based on first-order Riemannian optimization. To validate the efficacy of the acceleration schemes, we present results from extensive numerical experiments on large-scale datasets.
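For readers unfamiliar with the multiplicative updates mentioned above, the following is a minimal sketch of a basic (unaccelerated) SMART-style exponentiated gradient iteration on the positive orthant, not the authors' accelerated scheme. It assumes A and b are strictly positive and a step size tau small enough for convergence (e.g., below the reciprocal of the largest column sum of A); the names smart, tau, and num_iters are illustrative.

```python
import numpy as np

def smart(A, b, x0, tau=1.0, num_iters=500):
    """Exponentiated gradient sketch for minimizing f(x) = KL(Ax, b),
    where KL(u, v) = sum(u * log(u / v) - u + v).

    The gradient of f is A.T @ log(Ax / b); the multiplicative update
    x <- x * exp(-tau * grad) keeps the iterates strictly positive,
    which is the Bregman proximal gradient step for the negative
    entropy / KL geometry.
    """
    x = x0.copy()
    for _ in range(num_iters):
        r = A @ x                        # current model prediction Ax
        grad = A.T @ np.log(r / b)       # gradient of KL(Ax, b)
        x = x * np.exp(-tau * grad)      # multiplicative (exponentiated gradient) step
    return x

# Illustrative usage on a small random positive problem:
rng = np.random.default_rng(0)
A = rng.uniform(0.1, 1.0, size=(20, 10))
A /= A.sum(axis=0)                       # normalize columns so tau = 1 is safe
x_true = rng.uniform(0.5, 2.0, size=10)
b = A @ x_true
x = smart(A, b, x0=np.ones(10))
```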
Keywords
- Analysis and engineering of optimization algorithms
- SS - Advances in Nonlinear Optimization and Applications
- Multilevel optimization
Status: accepted