View the program in our Progressive Web App
Program
The program page offers the following features: you can browse the full program or a specific time slot in the schedule. In various places on the site, you can add sessions to your own personalized program, which you can always access through the «My Program» link in the menu. Note that this feature is only available if you are logged in. You can also export your personal program as a calendar file to import into your calendar application.
Due to heavy traffic on the site, we had to disable the add/remove session buttons on this page to keep response times reasonable. To add or remove a session, open its session page (by clicking on its title). We apologize for the inconvenience. Alternatively, we strongly encourage you to use our Progressive Web App (compatible with all devices).
Warning: the full program page may be slow to load.
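For reference, the exported personal program is a standard calendar file. Below is a minimal sketch, assuming the export uses the iCalendar (.ics) format, of what one session entry could look like; the calendar date, UID, and PRODID values are placeholders, not values taken from the actual export.

```python
from datetime import datetime, timezone

def session_event(uid, title, room, start, end):
    """Build a single VEVENT block for one session (placeholder values)."""
    fmt = "%Y%m%dT%H%M%S"
    stamp = datetime.now(timezone.utc).strftime(fmt) + "Z"
    return "\r\n".join([
        "BEGIN:VEVENT",
        f"UID:{uid}",
        f"DTSTAMP:{stamp}",
        f"DTSTART:{start.strftime(fmt)}",
        f"DTEND:{end.strftime(fmt)}",
        f"SUMMARY:{title}",
        f"LOCATION:{room}",
        "END:VEVENT",
    ])

# Example: the Monday 8:50-10:00 plenary (the date below is a placeholder).
event = session_event(
    uid="ma-01@example.org",
    title="MA-01: Plenary 1 - Constrained Optimization via Frank-Wolfe Algorithms",
    room="B100/1001",
    start=datetime(2025, 6, 30, 8, 50),
    end=datetime(2025, 6, 30, 10, 0),
)

calendar = "\r\n".join([
    "BEGIN:VCALENDAR",
    "VERSION:2.0",
    "PRODID:-//example//my-program//EN",
    event,
    "END:VCALENDAR",
]) + "\r\n"

# Write the file so it can be imported into a calendar application.
with open("my_program.ics", "w", newline="") as f:
    f.write(calendar)
```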
Popular sessions
Monday, 8:50-10:00
MA-01: Plenary 1
Stream: Plenaries
Room: B100/1001
Chair(s): Selin Ahipasaoglu
- Constrained Optimization via Frank-Wolfe Algorithms (Sebastian Pokutta)
Tuesday, 16:20-17:30
TD-01: Plenary 3 (EUROPT Lecture)
Stream: Plenaries
Room: B100/1001
Chair(s): Giancarlo Bigi, Immanuel Bomze
- Contextual Stochastic Bilevel Optimization (Daniel Kuhn)
Wednesday, 9:00-10:00
WA-01: Plenary 4
Stream: Plenaries
Room: B100/1001
Chair(s): Alain Zemkoho
Tuesday, 9:00-10:00
TA-01: Plenary 2
Stream: Plenaries
Room: B100/1001
Chair(s): Laura Palagi
Wednesday, 16:05-16:15
WD-01: Closing
Stream: Plenaries
Room: B100/1001
Chair(s): Alain Zemkoho, Giancarlo Bigi, Selin Ahipasaoglu
Monday, 10:30-12:30
MB-01: Advances in Large-Scale Derivative-Free Optimization
Stream: Zeroth and first-order optimization methods
Room: B100/1001
Chair(s): Francesco Rinaldi, Andrea Cristofari
- Dimensionality reduction techniques for derivative free optimization (Coralia Cartis)
- Solving 10,000-Dimensional Optimization Problems Using Inaccurate Function Values: An Old Algorithm (Zaikun Zhang)
- On the computation of the cosine measure in high dimensions (Scholar Sun)
- A Novel Stochastic Derivative-Free Trust-Region Algorithm with Adaptive Sampling and Moving Ridge Functions (Benjamin Rees, Christine Currie, Vuong Phan)
Wednesday, 10:30-12:30
WB-05: Recent advances in min-max optimization
Stream: Optimization for machine learning
Room: B100/4013
Chair(s): Ali Kavis, Aryan Mokhtari
- Steering Towards Success: Efficient Methods for Nonconvex-Nonconcave Minimax Problems (Pontus Giselsson, Anton Åkerman, Max Nilsson, Manu Upadhyaya, Sebastian Banert)
- A Universally Optimal Primal-Dual Method for Minimizing Heterogeneous Compositions (Benjamin Grimmer)
- Parameter-free second-order methods for min-max optimization (Ali Kavis, Ruichen Jiang, Qiujiang Jin, Sujay Sanghavi, Aryan Mokhtari)
Monday, 16:30-18:30
MD-10: Interactions between optimization and machine learning
Stream: Zeroth and first-order optimization methods
Room: B100/8011
Chair(s): Cesare Molinari, Silvia Villa
- From learning to optimize to learning optimization algorithms (Camille Castera, Peter Ochs)
- The Surprising Agreement Between Convex Optimization Theory and Learning-Rate Scheduling for Large Model Training (Fabian Schaipp, Alexander Hägele, Adrien Taylor, Umut Simsekli, Francis Bach)
- Greedy learning to optimise with convergence guarantees (Patrick Fahy, Mohammad Golbabaee, Matthias J. Ehrhardt)
- Learning from data via overparameterization (Cesare Molinari, Silvia Villa, Lorenzo Rosasco, Cristian Vega, Hippolyte Labarrière)
Monday, 14:00-16:00
MC-01: Strategies to Improve Zeroth-Order Optimization Methods
Stream: Zeroth and first-order optimization methods
Room: B100/1001
Chair(s): Francesco Rinaldi, Andrea Cristofari
- Improving the robustness of zeroth-order optimization solvers (Stefan M. Wild)
- Derivative free optimization with structured random directions (Silvia Villa, Marco Rando, Cheik Traoré, Cesare Molinari, Lorenzo Rosasco)
- Enhancing finite-difference-based derivative-free optimization with machine learning (Geovani Grapiglia, Timothé Taminiau, Estelle Massart)
- A derivative-free algorithm based on resilient positive spanning sets (Sébastien Kerleau, Clément Royer)
Monday, 14:00-16:00
MC-05: Optimization and machine learning II
Stream: Optimization for machine learning
Room: B100/4013
Chair(s): Laurent Condat
- Stabilized Proximal-Point Methods for Federated Optimization (Xiaowen Jiang, Anton Rodomanov, Sebastian Stich)
- A primal-dual algorithm for variational image reconstruction with learned convex regularizers (Hok Shing Wong)
- Entropic Mirror Descent for Linear Systems: Polyak’s Stepsize and Implicit Bias (Alexander Posch, Yura Malitsky)
Monday, 16:30-18:30
MD-06: Smoothing techniques for nonsmooth optimization
Stream: Nonsmooth and nonconvex optimization
Room: B100/7013
Chair(s): Olivier Fercoq
- Second-order proximal-gradient methods for avoiding nonsmooth strict saddle points (Alexander Bodard, Masoud Ahookhosh, Panagiotis Patrinos)
- A Proximal Variable Smoothing for Nonsmooth Minimization of the Sum of Three Functions Including Weakly Convex Composite Function (Keita Kume, Isao Yamada)
- Analyzing the speed of convergence in nonsmooth optimization via the Goldstein epsilon-subdifferential (Bennet Gebken)
- Minimizing the smoothed gap to solve saddle point problems (Olivier Fercoq)
Monday, 10:30-12:30
MB-08: Systematic and computer-aided analyses I: Analyses of proximal splittings methods & friends
Stream: Systematic and computer-aided analyses of optimization algorithms
Room: B100/7007
Chair(s): Aymeric Dieuleveut
- The Augmented Lagrangian Method for Infeasible Convex Optimization (Roland Andrews)
- Difference-of-convex algorithm with weakly convex functions: improved splitting technique and equivalence with proximal gradient descent (Teodor Rotaru, Panagiotis Patrinos, François Glineur)
- Forward-backward type splitting algorithms with minimal lifting I (Anton Åkerman, Emanuele Naldi, Enis Chenchene, Pontus Giselsson)
- Forward-backward type splitting algorithms with minimal lifting II (Emanuele Naldi, Anton Åkerman, Enis Chenchene, Pontus Giselsson)
Monday, 14:00-16:00
MC-08: Systematic and computer-aided analyses II: Systematic algorithmic design approaches
Stream: Systematic and computer-aided analyses of optimization algorithms
Room: B100/7007
Chair(s): François Glineur
- Exact Verification of First-Order Methods via Mixed-Integer Linear Programming (Vinit Ranjan, Jisun Park, Stefano Gualandi, Andrea Lodi, Bartolomeo Stellato)
- Tight Analysis of Second-Order Optimization Methods via Interpolation of generalized Hessian Lipschitz Univariate Functions (François Glineur, Nizar Bousselmi, Julien Hendrickx, Anne Rubbens)
- Forward-backward algorithms with deviations (Sebastian Banert)
Tuesday, 10:30-12:30
TB-04: Stochastic and Deterministic Global Optimization
Stream: Global optimization
Room: B100/5013
Chair(s): Dmitri Kvasov, Eligius M.T. Hendrix
- Global Optimization Algorithm through High-Resolution Sampling (Daniel Cortild, Claire Delplancke, Nadia Oudjane, Juan Peypouquet)
- ɛ-subdifferential methods for global DC optimization (Adil Bagirov)
- On global optimization of some nonlinear problems involving disjunctive constraints (Sonia Cafieri, Marcel Mongeau, Sebastien Bourguignon, Gwenaël Samain)
- Benchmarking tools for global optimization (Dmitri Kvasov, Yaroslav Sergeyev)
Tuesday, 10:30-12:30
TB-05: Randomized Optimization algorithms I
Stream: Optimization for machine learning
Room: B100/4013
Chair(s): Laurent Condat
- Scalable Second-Order Optimization Algorithms for Minimizing Low-Rank Functions (Edward Tansley, Coralia Cartis, Zhen Shao)
- Distributed Optimization with Communication Compression (Yuan Gao, Sebastian Stich)
- Stochastic Gradient Descent without Variance Assumption: A Tight Lyapunov Analysis (Lucas Ketels, Daniel Cortild, Guillaume Garrigos, Juan Peypouquet)
- Communication-Efficient Algorithms for Federated Learning and Weakly Coupled Games (Sebastian Stich, Ali Zindari, Parham Yazdkhasti, Anton Rodomanov, Tatjana Chavdarova)
Wednesday, 10:30-12:30
WB-09: Variational Analysis III
Stream: Variational analysis: theory and algorithms
Room: B100/8013
Chair(s): Francisco Javier Aragón Artacho
- Geometry and the complexity of first-order methods for Lipschitz optimization (Adrian Lewis)
- Inner approximations of convex sets and intersections of projectionally exposed cones (Vera Roshchina)
- Nonlinear Separation of Coradiant Sets: Optimality Conditions for Approximate Solutions (Miguel Angel Melguizo Padial, Fernando García Castaño)
- Nonsmooth optimization techniques for computing projected quasi-equilibria (Giancarlo Bigi)
Wednesday, 14:00-16:00
WC-10: Computational Aspects in Multiobjective Optimization
Stream: Multiobjective and Vector Optimization
Room: B100/8011
Chair(s): Felix Neussel
- Stochastic approximation in convex multiobjective optimization (Elena Molho, Carlo Alberto De Bernardi, Enrico Miglierina, Jacopo Somaglia)
- Benchmarking Nonlinear Multi-Objective Optimizers in Julia (Manuel Berkemeier)
- Parametrized convex MINLP: Warm-starting with Outer Approximation for Sequence of MINLPs (Erik Tamm, Gabriele Eichfelder, Jan Kronqvist)
- On image space transformations in multiobjective optimization (Felix Neussel, Oliver Stein)
Tuesday, 10:30-12:30
TB-10: First order methods: new perspectives for machine learning
Stream: Large scale optimization: methods and algorithms
Room: B100/8011
Chair(s): Cesare Molinari, Silvia Villa, Lorenzo Rosasco
- Convergence Analysis of Nonlinear Parabolic PDE Models with Neural Network Terms Trained with Gradient Descent (Konstantin Riedl, Justin Sirignano, Konstantinos Spiliopoulos)
- Randomized trust-region method for non-convex minimization (Radu-Alexandru Dragomir)
- Perspectives on the analysis and design of optimization algorithms: Lyapunov analyses and counter-examples (Adrien Taylor)
- Accelerated Gradient Methods via Inertial Systems with Hessian-driven Damping (Juan Peypouquet)
Tuesday, 10:30-12:30
TB-03: Theoretical and algorithmic advances in large scale nonlinear optimization and applications Part 1
Stream: Large scale optimization: methods and algorithms
Room: B100/4011
Chair(s): Stefania Bellavia, Benedetta Morini
- Fully stochastic trust-region methods with Barzilai-Borwein steplengths (Benedetta Morini, Mahsa Yousefi, Stefania Bellavia)
- prunAdag: an adaptive pruning-aware gradient method (Giovanni Seraghiti, Margherita Porcelli, Philippe L. Toint)
- An acceleration strategy for gradient methods in convex quadratic programming (Gerardo Toraldo, Serena Crisci, Anna De Magistris, Valentina De Simone)
- Corrective Frank-Wolfe: Unifying and Extending Correction Steps (Jannis Halbey, Seta Rakotomandimby, Mathieu Besançon, Sebastian Pokutta)
Wednesday, 14:00-16:00
WC-05: Recent Advances in Stochastic Optimization
Stream: Optimization for machine learning
Room: B100/4013
Chair(s): Chuan He
- Almost sure convergence rates for stochastic gradient methods (Simon Weissmann)
- A Hessian-Aware Stochastic Differential Equation for Modelling SGD (Xiang Li)
- Complexity guarantees for risk-neutral generalized Nash equilibrium problems (Meggie Marschner, Mathias Staudigl)
- Stochastic first-order methods can leverage arbitrarily higher-order smoothness for acceleration (Chuan He)