View the program in our Progressive Web App
Program
The program page offers the following features: you can browse the full program or a specific time slot in the schedule. Throughout the site you can add sessions to your own personalized program, which is always accessible through the «My Program» link in the menu (note that this feature requires you to be logged in). You can also export your personal program as a calendar file and import it into your agenda.
Due to heavy traffic on the site, we had to disable the add/remove buttons on this page to keep response times reasonable. To add or remove a session, open its session page (by clicking on the title). We are sorry for the inconvenience. Alternatively, we strongly encourage you to use our Progressive Web App (compatible with all devices).
Warning: the full program page may be slow to load.
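The calendar export mentioned above is typically an iCalendar (.ics) file. As a rough illustration only — the site's actual export format, dates, and UID scheme are not specified here, so everything below is a hypothetical sketch — one exported session could look like this:

```python
# Hypothetical sketch: rendering one session as an iCalendar (.ics) entry,
# the way a personal-program export might. The date and UID scheme are
# made up for illustration; only the session code, title, and room come
# from the program listing.
from datetime import datetime

def session_to_vevent(code, title, room, start, end):
    """Render one session as an iCalendar VEVENT block (CRLF line endings)."""
    fmt = "%Y%m%dT%H%M%S"
    return "\r\n".join([
        "BEGIN:VEVENT",
        f"UID:{code}@example-conference",  # hypothetical UID scheme
        f"DTSTART:{start.strftime(fmt)}",
        f"DTEND:{end.strftime(fmt)}",
        f"SUMMARY:{code}: {title}",
        f"LOCATION:Room {room}",
        "END:VEVENT",
    ])

# Example: the Wednesday 8:45-9:35 plenary (the year/month/day are assumed).
vevent = session_to_vevent(
    "WB-01", "Plenary I - Gabriel Peyré", "M:A",
    datetime(2024, 6, 26, 8, 45), datetime(2024, 6, 26, 9, 35),
)
calendar = "\r\n".join([
    "BEGIN:VCALENDAR", "VERSION:2.0", "PRODID:-//example//program//EN",
    vevent, "END:VCALENDAR",
])
print(calendar)
```

Most agenda applications (Google Calendar, Outlook, Apple Calendar) accept a file assembled this way.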
Popular sessions
Thursday, 16:20 - 17:20
TE-01: Plenary III - Gabriele Eichfelder, EUROPT Fellow
Stream: Plenaries
Room: M:A
Chair(s): Giancarlo Bigi, Pontus Giselsson, Oliver Stein
- Multiobjective optimization, uncertainty, and a bit of set optimization
  Gabriele Eichfelder
Thursday, 8:45 - 9:35
TA-01: Plenary II - Amir Beck
Stream: Plenaries
Room: M:A
Chair(s): Giancarlo Bigi
Wednesday, 8:45 - 9:35
WB-01: Plenary I - Gabriel Peyré
Stream: Plenaries
Room: M:A
Chair(s): Pontus Giselsson
- Conservation laws for gradient flows
  Gabriel Peyré
Friday, 8:45 - 9:35
FA-01: Plenary IV - Sebastian Stich
Stream: Plenaries
Room: M:A
Chair(s): Alp Yurtsever
- A Universal Framework for Federated (Convex) Optimization
  Sebastian Stich
Wednesday, 10:05 - 11:20
WC-05: Optimization for learning I
Stream: Optimization for learning
Room: M:N
Chair(s): Manu Upadhyaya
- Incorporating History and Deviations in Forward-Backward Splitting
  Pontus Giselsson
- Optimal Acceleration for Minimax and Fixed-Point Problems is Not Unique
  TaeHo Yoon, Jaeyeon Kim, Jaewook Suh, Ernest Ryu
- Accelerated Algorithms for Nonlinear Matrix Decomposition with the ReLU Function
  Giovanni Seraghiti, Arnaud Vandaele, Margherita Porcelli, Nicolas Gillis
Wednesday, 8:30 - 8:45
WA-01: Opening session
Stream: Opening session
Room: M:A
Chair(s): Pontus Giselsson
Wednesday, 11:25 - 12:40
WD-05: Optimization for learning II
Stream: Optimization for learning
Room: M:N
Chair(s): Manu Upadhyaya
- Compressed Gradient Descent with Matrix Stepsizes for Non-Convex Optimization
  Hanmin Li, Avetik Karagulyan, Peter Richtarik
- Optimization flows landing on the Stiefel manifold: continuous-time flows, deterministic and stochastic algorithms
  Bin Gao, P.-A. Absil
- Is maze-solving parallelizable?
  Romain Cosson
Wednesday, 14:10 - 15:50
WE-02: Recent advances in computer-aided analyses of optimization algorithms I
Stream: Conic optimization: theory, algorithms and applications
Room: M:O
Chair(s): Adrien Taylor, Manu Upadhyaya
- Last-Iterate Convergence of Extragradient-based Methods
  Eduard Gorbunov, Adrien Taylor, Samuel Horvath, Nicolas Loizou, Gauthier Gidel
- Automated tight Lyapunov analysis for first-order methods
  Manu Upadhyaya, Sebastian Banert, Adrien Taylor, Pontus Giselsson
- Second-order interpolation conditions for univariate functions, towards a tight analysis of second-order optimization methods
  Anne Rubbens, Nizar Bousselmi, Julien Hendrickx, François Glineur
- Analysis of Second-Order Methods via non-convex Performance Estimation
  Nizar Bousselmi, Anne Rubbens, Julien Hendrickx, François Glineur
Wednesday, 11:25 - 12:40
WD-02: Conic and polynomial optimization
Stream: Conic optimization: theory, algorithms and applications
Room: M:O
Chair(s): Immanuel Bomze
- Uncertain standard quadratic optimization under distributional assumptions: a chance-constrained epigraphic approach
  Immanuel Bomze, Daniel de Vicente
- On Tractable Convex Relaxations of Standard Quadratic Optimization Problems under Sparsity Constraints
  Bo Peng, Immanuel Bomze, Yuzhou Qiu, E. Alper Yildirim
- New results for sparse conic reformulations
  Markus Gabl
Thursday, 14:10 - 15:50
TD-02: Recent advances in computer-aided analyses of optimization algorithms III
Stream: Conic optimization: theory, algorithms and applications
Room: M:O
Chair(s): Adrien Taylor, Manu Upadhyaya
- Provable non-accelerations of the heavy-ball method
  Baptiste Goujaud, Adrien Taylor, Aymeric Dieuleveut
- Exact convergence rates of the last iterate in subgradient methods
  François Glineur, Moslem Zamani
- Algorithms with learned deviations
  Sebastian Banert
Wednesday, 14:10 - 15:50
WE-06: Higher-order Methods in Mathematical Programming I
Stream: Challenges in nonlinear programming
Room: M:H
Chair(s): Mathias Staudigl
- Spectral Preconditioning for Gradient Methods on Graded Non-convex Functions
  Nikita Doikov
- Barrier Algorithms for Constrained Non-Convex Optimization
  Pavel Dvurechensky, Mathias Staudigl
- Accelerated cubic regularized quasi-Newton methods
  Dmitry Kamzolov
- Relaxation Approaches for Nonlinear Sparse Optimization Problems
  Sonja Steffensen
Wednesday, 16:20 - 18:00
WF-06: Stochastic Gradient Methods: Bridging Theory and Practice
Stream: Challenges in nonlinear programming
Room: M:H
Chair(s): Simon Weissmann
- Stochastic Optimization under Hidden Convexity
  Ilyas Fatkhullin, Niao He, Yifan Hu
- On Almost Sure Convergence Rates for Stochastic Gradient Methods
  Sara Klein
- Optimal sampling for stochastic and natural gradient descent
  Robert Gruhlke, Philipp Trunschke, Anthony Nouy
- Stochastic gradient methods and tame geometry
  Johannes Aspman, Jiri Nemecek, Vyacheslav Kungurtsev, Fabio V. Difonzo, Jakub Marecek
Wednesday, 16:20 - 18:00
WF-02: Recent advances in computer-aided analyses of optimization algorithms II
Stream: Conic optimization: theory, algorithms and applications
Room: M:O
Chair(s): Adrien Taylor, Manu Upadhyaya
- Exact worst-case convergence rates of gradient descent: a complete analysis for all constant stepsizes over nonconvex and convex functions
  Teodor Rotaru, François Glineur, Panagiotis Patrinos
- A Linear-Quadratic Program for Estimating Performance of Convex Optimization Algorithm
  Ashkan Panahi
- On the convergence rate of the difference-of-convex algorithm (DCA)
  Hadi Abbaszadehpeivasti
- Non-expansiveness for frugal resolvent splitting methods, using PEP
  Anton Åkerman, Emanuele Naldi, Enis Chenchene, Sebastian Banert, Pontus Giselsson
Friday, 10:05 - 11:20
FB-04: Large-scale optimization II
Stream: Large-scale optimization
Room: M:M
Chair(s): Anton Åkerman
- An interior proximal gradient method for nonconvex optimization
  Alberto De Marchi, Andreas Themelis
- Krasnoselskii-Mann Iterations: Inertia, Perturbations and Approximation
  Daniel Cortild, Juan Peypouquet
- A Fast Optimistic Method for Monotone Variational Inequalities
  Michael Sedlmayer, Dang-Khoa Nguyen, Radu Ioan Bot
Friday, 14:10 - 15:50
FD-06: Difference and decomposition methods
Stream: Methods for non-/monotone inclusions and their applications
Room: M:H
Chair(s): Dânâ Davar
- Douglas-Rachford DC methods for generalized DC programming
  Avinash Dixit
- Revisiting Frank-Wolfe for Nonconvex Problems
  Hoomaan Maskan, Suvrit Sra, Alp Yurtsever
- Penalty Decomposition methods for convex and nonconvex market equilibrium models
  Giulio Scarponi, Marco Sciandrone
- A derivative-free trust-region method based on finite differences for composite nonsmooth optimization
  Dânâ Davar, Geovani Grapiglia
Thursday, 14:10 - 15:50
TD-05: Nonsmooth optimization algorithms
Stream: Nonsmooth and nonconvex optimization algorithms
Room: M:N
Chair(s): Susan Ghaderi
- A feasible directions method for nonconvex optimization over linear constraints with a nonsmooth concave regularizer
  Nadav Hallak, Amir Beck
- Spectral and nuclear norms of order three tensors: Complexity and computation
  Zhening Li
- High-order Moreau envelope in nonsmooth convex setting: L-smoothness and inexact gradient method
  Alireza Kabgani, Masoud Ahookhosh
- Scaled gradient methods under convexity and local smoothness
  Susan Ghaderi, Yves Moreau, Masoud Ahookhosh
Friday, 14:10 - 15:50
FD-02: Deterministic and stochastic optimization beyond Euclidean geometry
Stream: Advances in first-order optimization
Room: M:O
Chair(s): Adrien Taylor, Hadrien Hendrikx
- Horospherically Convex Optimization on Hadamard Manifolds
  Christopher Criscitiello, Jungbin Kim
- Implicit Regularisation of Mirror Flow on Separable Classification Problems
  Radu-Alexandru Dragomir
- Accelerated Bregman Divergence Optimization with SMART: An Information Geometric Point of View
  Stefania Petra
- Investigating Variance Definitions for Stochastic Mirror Descent with Relative Smoothness
  Hadrien Hendrikx
Wednesday, 10:05 - 11:20
WC-02: Conic and Semidefinite Optimization
Stream: Conic optimization: theory, algorithms and applications
Room: M:O
Chair(s): Miguel Anjos
- Learning to Relax Nonconvex Quadratically Constrained Quadratic Programs
  Burak Kocuk, Buket Özen
- Beyond Traditional PCA: The Two-Step-SDP Algorithm for Data Analysis
  Eloisa Macedo
- Semidefinite liftings for the complex cut polytope
  Miguel Anjos, Lennart Sinjorgo, Renata Sotirov
Thursday, 10:05 - 11:20
TB-05: Optimization for learning III
Stream: Optimization for learning
Room: M:N
Chair(s): Max Nilsson
- MAST: Model-Agnostic Sparsified Training
  Egor Shulgin, Peter Richtarik
- Online Learning and Information Exponents: The Importance of Batch Size & Time/Complexity Tradeoffs
  Ludovic Stephan
- Compressed and distributed least-squares regression: convergence rates with applications to Federated Learning
  Constantin Philippenko, Aymeric Dieuleveut
Thursday, 10:05 - 11:20
TB-03: In memory of Georg Still - part 1
Stream: In memory of Georg Still
Room: M:J
Chair(s): Oliver Stein
- On the weakest constraint qualification for sharp local minimizers
  Oliver Stein, Maximilian Volk
- Copositive and semi-infinite optimization
  Mirjam Dür
- A reflection on the work of Georg Still on semi-infinite linear optimization
  Etienne De Klerk