Program for the stream "Zeroth and first-order optimization methods"
Monday
Monday, 10:30-12:30
MB-01: Advances in Large-Scale Derivative-Free Optimization
Stream: Zeroth and first-order optimization methods
Room: B100/1001
Chair(s): Francesco Rinaldi, Andrea Cristofari
- Dimensionality reduction techniques for derivative-free optimization (Coralia Cartis)
- Solving 10,000-Dimensional Optimization Problems Using Inaccurate Function Values: An Old Algorithm (Zaikun Zhang)
- On the computation of the cosine measure in high dimensions (Scholar Sun)
- A Novel Stochastic Derivative-Free Trust-Region Algorithm with Adaptive Sampling and Moving Ridge Functions (Benjamin Rees, Christine Currie, Vuong Phan)
Monday, 14:00-16:00
MC-01: Strategies to Improve Zeroth-Order Optimization Methods
Stream: Zeroth and first-order optimization methods
Room: B100/1001
Chair(s): Francesco Rinaldi, Andrea Cristofari
- Improving the robustness of zeroth-order optimization solvers (Stefan M. Wild)
- Derivative-free optimization with structured random directions (Silvia Villa, Marco Rando, Cheik Traoré, Cesare Molinari, Lorenzo Rosasco)
- Enhancing finite-difference-based derivative-free optimization with machine learning (Geovani Grapiglia, Timothé Taminiau, Estelle Massart)
- A derivative-free algorithm based on resilient positive spanning sets (Sébastien Kerleau, Clément Royer)
Monday, 16:30-18:30
MD-01: Derivative-Free Optimization Methods for Challenging Applications: Handling Nonsmoothness and Constraints
Stream: Zeroth and first-order optimization methods
Room: B100/1001
Chair(s): Francesco Rinaldi, Andrea Cristofari
- Derivative-free Penalty-IPM for nonsmooth constrained optimization (Andrea Brilli, Youssef Diouane, Sébastien Le Digabel, Giampaolo Liuzzi, Christophe Tribes)
- Complexity of a Riemannian Direct-search Algorithm (Bastien Cavarretta, Clément Royer, Florian Yger, Florentin Goyens)
- TRFD: A derivative-free trust-region method based on finite differences for composite nonsmooth optimization (Dânâ Davar, Geovani Grapiglia)
- On the Pareto-efficient points of Simple Constrained Multiobjective Problems (Dimo Brockhoff)
Monday, 16:30-18:30
MD-10: Interactions between optimization and machine learning
Stream: Zeroth and first-order optimization methods
Room: B100/8011
Chair(s): Cesare Molinari, Silvia Villa
- From learning to optimize to learning optimization algorithms (Camille Castera, Peter Ochs)
- The Surprising Agreement Between Convex Optimization Theory and Learning-Rate Scheduling for Large Model Training (Fabian Schaipp, Alexander Hägele, Adrien Taylor, Umut Simsekli, Francis Bach)
- Greedy learning to optimise with convergence guarantees (Patrick Fahy, Mohammad Golbabaee, Matthias J. Ehrhardt)
- Learning from data via overparameterization (Cesare Molinari, Silvia Villa, Lorenzo Rosasco, Cristian Vega, Hippolyte Labarrière)
Tuesday
Tuesday, 10:30-12:30
TB-01: Zeroth-Order Optimization Methods for Stochastic and Noisy Problems
Stream: Zeroth and first-order optimization methods
Room: B100/1001
Chair(s): Francesco Rinaldi, Andrea Cristofari
- Unifying Trust-region Algorithms with Adaptive Sampling for Nonconvex Stochastic Optimization (Sara Shashaani, Yunsoo Ha)
- Direct Search Methods for Stochastic Zeroth-Order Problems (Francesco Rinaldi, Andrea Cristofari)
- Analysis of derivative-free algorithms on noisy problems (Alexandre Chotard, Anne Auger)
- Derivative-Free Constrained Optimization in Hydraulics: Augmented Lagrangian Method with BOBYQA (Fabio Fortunato Filho, José Mario Martínez)
Tuesday, 14:00-16:00
TC-01: First-Order Methods for Structured Optimization and Sampling
Stream: Zeroth and first-order optimization methods
Room: B100/1001
Chair(s): Cesare Molinari, Silvia Villa
- Constrained sampling with Primal-Dual Langevin Monte Carlo (Luiz Chamon)
- A Fast Extra-Gradient Method with Flexible Anchoring (Enis Chenchene, Radu Ioan Bot)
- Risk-averse guarantees for stochastic min-max problems (Yassine Laguel, Mert Gürbüzbalaban, Necdet Serhat Aybat, Yasa Syed)
Wednesday
Wednesday, 10:30-12:30
WB-01: Advances in Stochastic and Non-Euclidean First-Order Methods
Stream: Zeroth and first-order optimization methods
Room: B100/1001
Chair(s): Cesare Molinari, Silvia Villa
- Non-monotone stochastic line search without overhead for training neural networks (Andrea Cristofari, Leonardo Galli, Stefano Lucidi)
- On the convergence of stochastic Bregman proximal gradient algorithms with biased gradient estimators (Thomas Guilmeau, Emilie Chouzenoux, Víctor Elvira)
- Riemannian gradient descent improves parameter-efficient fine-tuning (Bingcong Li)
- General Tail Bounds for Non-Smooth Stochastic Mirror Descent (Andrea Paudice, Khaled Eldowa)
Wednesday, 14:00-16:00
WC-01: Advances in Multiobjective and Bilevel Optimization without Derivatives
Stream: Zeroth and first-order optimization methods
Room: B100/1001
Chair(s): Francesco Rinaldi, Andrea Cristofari
- A Direct Multisearch (DMS) Approach for Many-Objective Derivative-Free Optimization (Everton Silva, Ana Luisa Custodio)
- Exploring Polynomial Models in the Search Step of Direct Multisearch (Marta Pozzi, Everton Silva, Ana Luisa Custodio)
- Derivative-Free Bilevel Optimization with Inexact Lower-Level Solutions (Edoardo Cesaroni, Giampaolo Liuzzi, Stefano Lucidi)