571. Corrective Frank-Wolfe: Unifying and Extending Correction Steps
Invited abstract in session TB-3: Theoretical and algorithmic advances in large scale nonlinear optimization and applications Part 1, stream Large scale optimization: methods and algorithms.
Tuesday, 10:30-12:30, Room: B100/4011
Authors (first author is the speaker)
1. Jannis Halbey
2. Seta Rakotomandimby, ENPC
3. Mathieu Besançon, Zuse Institute Berlin
4. Sebastian Pokutta, ZIB/TUB
Abstract
Fully-Corrective Frank-Wolfe (FCFW) is a powerful extension of the classical Frank-Wolfe algorithm, offering accelerated convergence by re-optimizing over the convex hull of previously selected atoms.
Despite its strong theoretical guarantees, FCFW is rarely used in practice, as its correction step often requires solving a subproblem nearly as hard as the original one.
Several conditional gradient-type methods with more lightweight corrective steps on the active set have since emerged.
In this work, we propose Corrective Frank-Wolfe, a unifying framework for these methods that provides a general proof scheme establishing linear convergence for strongly convex objectives over polytopes.
Furthermore, we introduce two corrective steps that leverage second-order information, yielding significant practical speedups at the cost of solving a linear system or a linear program, respectively.
Computational experiments demonstrate substantial improvements, particularly on quadratic objectives.
Notably, our approach mitigates a key practical bottleneck of the theoretically optimal Conditional Gradient Sliding (CGS) method, which heavily depends on solving a projection subproblem.
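To illustrate the kind of algorithm the abstract describes, the following is a minimal toy sketch of a corrective Frank-Wolfe loop, not the authors' method: each outer iteration calls a linear minimization oracle (LMO) to pick a new atom, then performs a cheap correction step that approximately re-optimizes the convex-combination weights over the active atoms. The example uses a simple quadratic objective over the probability simplex; all names and parameters are illustrative.

```python
import numpy as np

def lmo_simplex(grad):
    # Linear minimization oracle over the probability simplex:
    # returns the vertex e_i minimizing <grad, e_i>.
    v = np.zeros_like(grad)
    v[np.argmin(grad)] = 1.0
    return v

def corrective_fw(b, iters=30, corr_steps=10):
    # Toy corrective Frank-Wolfe for f(x) = 0.5 * ||x - b||^2
    # over the probability simplex (illustrative only).
    dim = b.shape[0]
    atoms = [np.eye(dim)[0]]   # active set, initialized at a vertex
    w = np.array([1.0])        # convex-combination weights over the atoms
    for _ in range(iters):
        A = np.stack(atoms)    # rows are the active atoms
        x = A.T @ w
        v = lmo_simplex(x - b)  # new atom proposed by the LMO
        if not any(np.array_equal(v, a) for a in atoms):
            atoms.append(v)
            w = np.append(w, 0.0)
            A = np.stack(atoms)
        # Correction step: approximately re-optimize the weights over the
        # convex hull of the active atoms with a few Frank-Wolfe steps in
        # weight space (exact line search is closed-form for quadratics).
        for _ in range(corr_steps):
            residual = A.T @ w - b
            g = A @ residual                # gradient of f(A^T w) w.r.t. w
            d = lmo_simplex(g) - w          # FW direction on the weight simplex
            Ad = A.T @ d
            denom = Ad @ Ad
            if denom <= 1e-16:
                break
            gamma = np.clip(-(Ad @ residual) / denom, 0.0, 1.0)
            w = w + gamma * d
    return np.stack(atoms).T @ w
```

With a target `b` inside the simplex, the iterate converges to `b` itself; the correction here is deliberately cheap (a few inner FW steps) rather than the exact convex-hull re-optimization of fully-corrective FW, mirroring the abstract's point that lighter corrective steps can replace the hard subproblem.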
Keywords
- First-order optimization
Status: accepted