594. A Universally Optimal Primal-Dual Method for Minimizing Heterogeneous Compositions
Invited abstract in session WB-5: Recent advances in min-max optimization, stream Optimization for machine learning.
Wednesday, 10:30-12:30, Room: B100/4013
Authors (first author is the speaker)
1. Benjamin Grimmer, Applied Mathematics and Statistics, Johns Hopkins University
Abstract
This talk will propose a universal, optimal algorithm for convex minimization problems of the composite form g0(x)+h(g1(x),…,gm(x))+u(x), which take a minimax form when h is dualized. We allow each gj to independently range from being nonsmooth Lipschitz to smooth, from convex to strongly convex, described by notions of Hölder continuous gradients and uniform convexity. Note that, although the objective is built from a heterogeneous combination of such structured components, it does not necessarily possess smoothness, Lipschitzness, or any favorable structure overall other than convexity. Regardless, we provide a universal optimal method in terms of oracle access to (sub)gradients of each gj. The key insight enabling our optimal universal analysis is the construction of two new constants, the Approximate Dualized Aggregate smoothness and strong convexity, which combine the benefits of each heterogeneous structure into single quantities amenable to analysis. As a key application, fixing h as the nonpositive indicator function, this model readily captures functionally constrained minimization g0(x)+u(x) subject to gj(x)≤0.
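For concreteness, the minimax form mentioned above follows from standard Fenchel duality; a sketch (writing h* for the convex conjugate of h, and assuming h is closed convex so that strong duality applies) is:

```latex
\min_{x}\; g_0(x) + h\big(g_1(x),\dots,g_m(x)\big) + u(x)
\;=\;
\min_{x}\,\max_{y}\; g_0(x) + u(x) + \sum_{j=1}^{m} y_j\, g_j(x) - h^*(y).
```

In the constrained application at the end of the abstract, taking h to be the indicator of the nonpositive orthant makes h* the indicator of the nonnegative orthant, and the inner maximization over y ≥ 0 recovers the usual Lagrangian of g0(x)+u(x) subject to gj(x) ≤ 0.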
Keywords
- Non-smooth optimization
- First-order optimization
Status: accepted