EUROPT Summer School 2023

School dates, venue and topics

The school will take place on 21-22 August at the Corvinus Center for Operations Research of the Corvinus University of Budapest, Hungary. The school is planned to be in-person only; no live streaming will be provided. There will be two courses focusing on two approaches to solving continuous optimization problems, specifically on

1. Derivative-free optimization

2. Fixed point algorithms

The lectures on derivative-free optimization will be delivered by Coralia Cartis, and those on fixed point algorithms by Russell Luke. Each day includes 6 hours of lectures plus coffee breaks; the exact timetable is given below.

Attendance 

Attendance is free of charge, but registration is mandatory. The lectures are particularly suited to PhD students and young researchers, offering them the chance to attend two high-level courses on continuous optimization, but the school is open to everyone wishing to participate.

To register, fill in this form by 30 June. Please register only when you are sure you will attend: the available classroom is large, so everyone should have a place; in the very unlikely event of too many registrations, priority will not be based on early booking.

Timetable 

Monday 21 August – Russell Luke
Tuesday 22 August – Coralia Cartis

  9:00 – 10:30 Lecture I
10:30 – 11:00 Coffee break
11:00 – 12:30 Lecture II
12:30 – 14:00 Lunch break
14:00 – 15:30 Lecture III

The courses

Derivative-free optimization: problems, algorithms and applications

by Coralia Cartis

Real-life applications often require the optimization of nonlinear functions with several unknowns or parameters, where the function is the result of highly expensive and complex model simulations involving noisy data (such as climate or financial models, or chemical experiments), or the output of a black-box or legacy code, which prevents the numerical analyst from looking inside to find out or calculate problem information such as derivatives. Thus classical optimization algorithms that use derivatives (steepest descent, Newton's methods) often fail or are entirely inapplicable in this context. Efficient derivative-free optimization algorithms have been developed over the last twenty years in response to these imperative practical requirements. As even approximate derivatives may be unavailable, these methods must explore the landscape differently and more creatively. In state-of-the-art techniques, clouds of points are generated judiciously and sporadically updated to capture local geometries as inexpensively as possible; sometimes, local function models around these points are built using techniques from approximation theory and carefully optimised over a local neighbourhood (a trust region) to give a better solution estimate.
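To make the model-based trust-region idea concrete, here is a minimal, purely illustrative sketch in Python (not code from the course; the test function, the acceptance threshold and the radius cap are assumptions made only for the example). It builds a linear model of the black-box function from a small cloud of sample points, minimises that model over the trust region, and accepts or rejects the trial point with the usual ratio test.

import numpy as np

def dfo_trust_region_step(f, x0, delta, eta=0.1):
    """One illustrative step of a linear-model trust-region method, derivative-free."""
    n = x0.size
    f0 = f(x0)
    # Cloud of sample points: x0 plus one point along each coordinate direction.
    steps = delta * np.eye(n)
    f_samples = np.array([f(x0 + s) for s in steps])
    # Interpolate a linear model m(x) = f0 + g.(x - x0) through the sampled cloud;
    # g is just a forward-difference slope estimate (no derivatives of f are used).
    g = (f_samples - f0) / delta
    g_norm = np.linalg.norm(g)
    if g_norm == 0.0:
        return x0, delta  # flat model: keep the current point and radius
    # The minimiser of a linear model over the ball ||x - x0|| <= delta is explicit.
    x_trial = x0 - (delta / g_norm) * g
    predicted_decrease = delta * g_norm          # f0 - m(x_trial)
    actual_decrease = f0 - f(x_trial)
    rho = actual_decrease / predicted_decrease   # agreement between model and function
    if rho >= eta:
        return x_trial, min(2.0 * delta, 1.0)    # good step: accept, possibly expand
    return x0, 0.5 * delta                       # poor step: reject, shrink the region

# Toy usage on a smooth function treated as a black box (illustrative only).
f = lambda x: (x[0] - 1.0) ** 2 + 10.0 * (x[1] + 0.5) ** 2
x, delta = np.array([0.0, 0.0]), 0.5
for _ in range(60):
    x, delta = dfo_trust_region_step(f, x, delta)
print(x)  # should end up close to the minimiser (1, -0.5)

Practical model-based methods refine this template considerably: the interpolation set is reused and updated judiciously rather than resampled at every iteration, and richer (for example quadratic) models are maintained and minimised over the trust region.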

This course will describe the remit and use of derivative-free optimization paradigms, survey the main classes of algorithms together with their key constructions and properties, and present recent developments in the subclass of model-based trust-region methods. For the latter, we will focus on improvements in the robustness of these methods in the presence of noisy evaluations, on scalable subspace variants, and on simplified approaches for the ubiquitous data-fitting/nonlinear least-squares problems. Parameter estimation for climate modelling will provide a useful application for these techniques. Time permitting, we will contrast derivative-free optimization techniques with global optimization approaches, discussing numerical costs and benefits.

 

Quantitative convergence analysis for fixed point algorithms in continuous optimization

by Russell Luke

Quantifying the complexity of algorithms in continuous optimization is often carried out on a case-by-case basis, but the general strategy of the arguments is always the same. For those algorithms that can be formulated as fixed point iterations (and a great number of them can), the analysis boils down to two central properties which can be defined with such generality that the analytical arguments can be lifted to a huge variety of settings and spaces. In this short course I will try to demonstrate the three main tasks in carrying out this kind of analysis. The first task is to formulate an algorithm for continuous optimization as a fixed point iteration. Secondly, I will introduce almost alpha-firm mappings and show how to recognize that your algorithm has this property. Knowing that the algorithm is alpha-firmly nonexpansive is sufficient to show convergence of residuals, even with rates, but convergence of the iterates cannot be determined by this property alone. Finally, I will introduce metric subregularity and its many manifestations, and I will show how this is used to arrive at complexity estimates for the iterates of algorithms. As time allows, I will illustrate the whole process on (nonconvex) applications in the usual finite-dimensional vector spaces, and on convex applications in nonlinear metric spaces and in probability spaces.
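For orientation, the two central properties can be sketched as follows in a Hilbert-space setting; the notation below is standard, but the precise definitions used in the course (for instance the "almost" variants of these properties) may differ.

A mapping $T\colon D\to X$ is $\alpha$-firmly nonexpansive on $D$ with constant $\alpha\in(0,1)$ if
\[
\|Tx - Ty\|^2 \;\le\; \|x - y\|^2 \;-\; \frac{1-\alpha}{\alpha}\,\|(x - Tx) - (y - Ty)\|^2
\qquad \text{for all } x, y \in D.
\]
For example, the gradient-descent mapping $T = \mathrm{Id} - t\nabla f$, for convex $f$ with $L$-Lipschitz gradient and step size $t\in(0,2/L)$, satisfies this with $\alpha = tL/2$, and its fixed points are exactly the minimizers of $f$. The residual mapping $F = \mathrm{Id} - T$ is metrically subregular at $\bar{x}$ for $0$ if there exist $\kappa > 0$ and a neighbourhood $U$ of $\bar{x}$ such that
\[
\operatorname{dist}\bigl(x, F^{-1}(0)\bigr) \;\le\; \kappa\,\|F(x)\| \qquad \text{for all } x \in U,
\]
and it is this error bound that upgrades convergence of the residuals $\|x^k - Tx^k\|$ into quantitative convergence of the iterates $x^k$ themselves.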

Lecture notes and slides are available here.

The lecturers

Coralia Cartis (University of Oxford)

Coralia Cartis is Professor of Numerical Optimization at the Mathematical Institute, University of Oxford, and a Turing Fellow at the Alan Turing Institute for Data Science. She received a BSc degree in mathematics from Babeș-Bolyai University, Cluj-Napoca, Romania, and a PhD degree in mathematics from the University of Cambridge, under the supervision of Prof Michael J.D. Powell. Prior to her current post, she worked as a research scientist at Rutherford Appleton Laboratory and as a tenured assistant professor in the School of Mathematics, University of Edinburgh. Her research interests include the development and analysis of nonlinear optimization algorithms and diverse applications of optimization, from climate modeling to signal processing and machine learning. She serves on the editorial boards of leading optimization and numerical analysis journals and has received several prizes for her research; in particular, she was named a SIAM Fellow in the Class of 2023 and elected a EUROPT Fellow in 2023.

Russell Luke (Universität Göttingen)

Russell Luke grew up outside of Granville, Ohio (near Columbus) and attended college at the University of California, Berkeley, where he graduated with honors in Applied Mathematics in 1991 (thesis advisor Hans Bremermann). After college, he took a break from an academic career, part spent making documentary films ("The Ride to Wounded Knee", 1992, as post-production manager, assistant editor and sound editor; "29 and 7 Strong", 1995, all but narration) and part spent as a VISTA volunteer building Self-Help homes in Okanogan County, Washington (grant writer, Spanish translator, real-estate purchaser, and "documentarist"). He returned to Applied Mathematics at the University of Washington in Seattle, where he received an MSc in December 1997 and a PhD in June 2001 under the guidance of James Burke. He was a NASA/GSFC Graduate Student Research Fellow from 1998 to 2001; this experience grew into the central application of his PhD thesis on the theory and practice of numerical algorithms for adaptive optics to be used with the James Webb Space Telescope, Hubble's replacement. After graduation he moved to the University of Göttingen in Germany to join Roland Potthast and Rainer Kress's group at the Institute for Numerical and Applied Mathematics, where he worked on inverse scattering theory and research cooperation with industry partners (July 2001 to April 2003). He was with the Mathematics Department at Simon Fraser University near Vancouver, Canada, as a PIMS Fellow under Jonathan Borwein and Adrian Lewis from December 2002 until August 2004. He joined the University of Delaware in 2004 and earned promotion and tenure in 2009 before moving back to Göttingen. He is currently Professor of Continuous Optimization and Variational Analysis at the Universität Göttingen. He is a member of the Society for Industrial and Applied Mathematics (SIAM), the Deutsche Mathematiker-Vereinigung (DMV), and the Gesellschaft für Operations Research (GOR); an Area Editor for the Open Journal of Mathematical Optimization; and an Associate Editor of the Journal of Optimization Theory and Applications (JOTA), ESAIM: Control, Optimisation and Calculus of Variations (COCV), the SIAM Journal on Optimization, Advances in Computational Mathematics, and the IMA Journal of Numerical Analysis.

 

Organisation