503. Automatic recommendation of optimization methods through their worst-case complexity
Invited abstract in session MB-5: Optimization and machine learning I, stream Optimization for machine learning.
Monday, 10:30-12:30, Room: B100/4013
Authors (first author is the speaker)
1. Sofiane Tanji, UCLouvain
2. François Glineur, ICTEAM/INMA & CORE, Université catholique de Louvain (UCLouvain)
Abstract
Since the advent of modern computational mathematics, the literature on optimization algorithms has been ever-growing, and a wide range of methods are now available. Typically, these methods apply to specific templates of optimization problems and are accompanied by proofs of their convergence rates. Templates vary: an objective may be a single function or a sum of functions with different properties (e.g., smoothness, convexity, a computable proximal operator), and constraints may be linear, involve a feasible set with a projection operator, or be functional. Hence, for a given optimization problem, it is not always easy to identify which templates it can match, especially if one wants to consider equivalent reformulations of the problem. This makes the task of choosing an optimization method tedious.
We propose a general framework to recommend optimization methods for any oracle-based problem. Given a user-provided optimization problem, the framework: (1) computes equivalent reformulations; (2) checks which formulations match known templates; (3) retrieves the applicable optimization methods; and (4) orders these methods according to their certified worst-case performance.
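The last three steps can be illustrated with a minimal toy sketch. This is not the authors' framework: the problem template, the method registry, and the worst-case bounds (the classical O(L/N) bound for gradient descent with stepsize 1/L and the O(L/N²) bound for Nesterov's accelerated gradient on smooth convex functions, both stated per unit of initial distance R²) are standard textbook facts used here purely as illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Problem:
    smooth: bool   # gradient is L-Lipschitz
    convex: bool
    L: float       # smoothness constant

def applicable_methods(p: Problem, N: int):
    """Steps (2)-(3): match the problem against known templates and
    retrieve applicable methods with their certified worst-case bounds
    on f(x_N) - f* after N iterations (per unit R^2)."""
    methods = []
    if p.smooth and p.convex:
        # Gradient descent, stepsize 1/L:  f(x_N) - f* <= L R^2 / (2N)
        methods.append(("gradient descent", p.L / (2 * N)))
        # Accelerated gradient:            f(x_N) - f* <= 2 L R^2 / (N+1)^2
        methods.append(("accelerated gradient", 2 * p.L / (N + 1) ** 2))
    return methods

def recommend(p: Problem, N: int = 100):
    """Step (4): order the retrieved methods by worst-case performance."""
    return sorted(applicable_methods(p, N), key=lambda m: m[1])

best = recommend(Problem(smooth=True, convex=True, L=10.0))[0]
```

For a smooth convex problem and a moderate iteration budget, the ordering correctly ranks the accelerated method first, since its O(1/N²) certificate dominates the O(1/N) certificate of plain gradient descent.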
Moreover, when algorithms or templates feature hyperparameters (such as a stepsize or the amount of curvature shift), our approach also provides the best problem-dependent values for those hyperparameters according to their worst-case convergence rates.
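As a concrete instance of worst-case-optimal hyperparameter selection (again a standard textbook example, not the paper's actual procedure): for gradient descent on a μ-strongly convex, L-smooth function, the per-iteration contraction factor in distance to the optimum is max(|1 − hμ|, |1 − hL|), and minimizing it over the stepsize h recovers the classical optimum h* = 2/(L + μ).

```python
def contraction(h: float, mu: float, L: float) -> float:
    """Worst-case per-iteration contraction of ||x_k - x*|| for gradient
    descent with stepsize h on a mu-strongly convex, L-smooth function."""
    return max(abs(1 - h * mu), abs(1 - h * L))

def best_stepsize(mu: float, L: float, grid_size: int = 100_000) -> float:
    """Pick the stepsize minimizing the worst-case rate over a grid on (0, 2/L]."""
    grid = [2 * k / (grid_size * L) for k in range(1, grid_size + 1)]
    return min(grid, key=lambda h: contraction(h, mu, L))

mu, L = 1.0, 10.0
h_star = best_stepsize(mu, L)  # close to 2/(L + mu) = 2/11
```

The grid search stands in for what a framework would obtain in closed form or via a performance-estimation computation; the point is only that the recommended hyperparameter value falls out of the worst-case rate itself.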
Keywords
- Computer-aided algorithm analysis
- Complexity and efficiency of algorithms
- Optimization software
Status: accepted