170. A divergence-based condition to ensure quantile improvement in black-box global optimization
Invited abstract in session WE-4: Large-scale optimization I, stream Large-scale optimization.
Wednesday, 14:10 - 15:50, Room: M:M
Authors (first author is the speaker)
1. Thomas Guilmeau (Université Paris-Saclay, CentraleSupélec, INRIA)
2. Emilie Chouzenoux (Université Paris-Est Marne-La-Vallée)
3. Víctor Elvira (School of Mathematics, University of Edinburgh)
Abstract
Black-box global optimization aims at finding the minimizers of an objective function whose analytical form is unknown. To do so, many state-of-the-art methods rely on sampling-based strategies, in which sampling distributions are built iteratively so that their mass concentrates where the objective function is low. Despite empirical success, the convergence of these methods remains difficult to establish theoretically. In this work, we introduce a new framework, based on divergence-decrease conditions, to study and design black-box global optimization algorithms. We show that the information-geometric optimization approach fits within our framework, which yields a new proof of its convergence. We also establish a quantile improvement result for two novel algorithms, one related to the cross-entropy approach with mixture models and another using heavy-tailed sampling distributions.
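To illustrate the kind of iterative, sampling-based strategy the abstract refers to, the following is a minimal sketch of a standard cross-entropy method with a Gaussian sampling distribution. All names and parameter values are illustrative assumptions; this is not the algorithms or the divergence-decrease framework introduced in the paper, only a generic example of concentrating a sampling distribution on the low-objective region.

```python
import numpy as np

def cross_entropy_minimize(f, dim, n_samples=100, elite_frac=0.1, n_iters=50, seed=0):
    """Minimal sketch of a cross-entropy method for black-box minimization.

    At each iteration, candidates are drawn from a Gaussian sampling
    distribution, the best (lowest-objective) quantile is kept, and the
    Gaussian parameters are refit to these elites so that the mass of the
    sampling distribution concentrates where f is low.
    """
    rng = np.random.default_rng(seed)
    mu, sigma = np.zeros(dim), np.ones(dim)          # initial sampling distribution
    n_elite = max(1, int(elite_frac * n_samples))    # size of the elite quantile
    for _ in range(n_iters):
        x = rng.normal(mu, sigma, size=(n_samples, dim))   # sample candidates
        elite = x[np.argsort(f(x))[:n_elite]]              # keep the elite quantile
        mu = elite.mean(axis=0)                             # refit the distribution
        sigma = elite.std(axis=0) + 1e-12
    return mu

# Example usage on a simple quadratic objective (illustrative only)
if __name__ == "__main__":
    f = lambda x: np.sum((x - 3.0) ** 2, axis=-1)
    print(cross_entropy_minimize(f, dim=2))
```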
Keywords
- Derivative-free optimization
- Global optimization
- Nature inspired methods and algorithms
Status: accepted