180. An optimal lower bound for smooth convex functions
Invited abstract in session FC-4: Large-scale optimization III, stream Large-scale optimization.
Friday, 11:25 - 12:40, Room: M:M
Authors (first author is the speaker)
1. Mihai I. Florea, Mathematical Engineering, Université catholique de Louvain
Abstract
First-order methods endowed with global convergence guarantees operate using global lower bounds on the objective. Tightening these bounds has been shown to improve both theoretical guarantees and practical performance. In this work, we define a global lower bound for smooth convex objectives that is optimal with respect to the collected oracle information. Using the machinery underlying the optimal bounds, we construct an Optimized Gradient Method with Memory possessing the best known convergence guarantees for its class of algorithms, even in terms of the proportionality constant. We additionally equip the method with an adaptive convergence guarantee adjustment procedure that serves as an effective replacement for line-search. Preliminary simulation results validate the theoretical properties of the proposed method.
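For context, a minimal sketch of the kind of oracle-based global lower bound such methods aggregate: the classical piecewise-linear (bundle-type) bound obtained by taking the maximum of first-order linearizations at queried points. This is only the standard baseline that bounds of this type refine, not the optimal bound constructed in the abstract; all names and the test function below are illustrative.

```python
import numpy as np

def linearization_bound(x, points, values, grads):
    """Lower-bound f(x) by the maximum of stored first-order linearizations.

    For convex f, f(x) >= f(x_i) + <grad f(x_i), x - x_i> at every previously
    queried point x_i, so the pointwise maximum over the collected oracle
    pairs is a valid global lower bound on f.
    """
    return max(f_i + g_i @ (x - x_i)
               for x_i, f_i, g_i in zip(points, values, grads))

# Illustrative smooth convex objective: f(x) = 0.5 * ||x||^2 (not from the paper).
f = lambda x: 0.5 * (x @ x)
grad = lambda x: x

rng = np.random.default_rng(0)
points = [rng.standard_normal(3) for _ in range(5)]  # previously queried iterates
values = [f(x_i) for x_i in points]                  # oracle function values
grads = [grad(x_i) for x_i in points]                # oracle gradients

x = rng.standard_normal(3)
assert linearization_bound(x, points, values, grads) <= f(x)  # bound is valid
```

A gradient method with memory keeps such a model of the objective across iterations; the tighter the model given the same oracle information, the stronger the guarantees it can support.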
Keywords
- Large- and Huge-scale optimization
- Global optimization
- Linear and nonlinear optimization
Status: accepted