EURO 2024 Copenhagen

277. Accelerating Randomized Adaptive Subspace Trust-Region Derivative-Free Algorithms

Invited abstract in session TC-32: Algorithms for machine learning and inverse problems: zeroth-order optimisation, stream Advances in large scale nonlinear optimization.

Tuesday, 12:30-14:00
Room: 41 (building: 303A)

Authors (first author is the speaker)

1. Stefan M. Wild
Applied Mathematics and Computational Research Division, Lawrence Berkeley National Laboratory
2. Kwassi Joseph Dzahini
Mathematics and Computer Science Division, Argonne National Laboratory
3. Xiaoqian Liu
The University of Texas MD Anderson Cancer Center

Abstract

This talk presents an algorithm for solving deterministic derivative-free optimization problems with thousands of decision variables. We employ a randomized dimension reduction technique based on Johnson-Lindenstrauss transforms and the trust-region framework of Cartis and Roberts. A key ingredient in our work is extending the affine space within which the algorithm operates to include information learned in the course of the optimization. By adapting this subspace and judiciously reusing that information, we obtain significant performance benefits on a class of large-scale test problems without sacrificing convergence guarantees.
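
To illustrate the randomized subspace idea described above, here is a minimal Python sketch, not the authors' algorithm: at each iteration a Gaussian Johnson-Lindenstrauss sketch maps the trust-region step into a low-dimensional subspace, a model is built from function values only, and the step is accepted or rejected by the usual ratio test. The function name `sketched_tr_dfo`, the linear finite-difference model, and all parameter defaults are illustrative assumptions; the actual work builds on the Cartis-Roberts interpolation-based framework and an adaptive affine space, which this sketch omits.

```python
# Minimal sketch (illustrative only) of a randomized subspace
# trust-region derivative-free iteration. Assumptions:
#   * f is the black-box objective, evaluated only at requested points
#   * p is the sketch dimension (p << n), drawn as a scaled Gaussian
#     Johnson-Lindenstrauss map each iteration
#   * a simple linear model from finite differences along the subspace
#     directions stands in for the interpolation models used in practice
import numpy as np


def sketched_tr_dfo(f, x0, p=5, delta0=1.0, max_iter=200,
                    eta=0.1, gamma_dec=0.5, gamma_inc=2.0, seed=0):
    rng = np.random.default_rng(seed)
    x, delta = np.asarray(x0, dtype=float), delta0
    fx = f(x)
    n = x.size

    for _ in range(max_iter):
        # Draw a Johnson-Lindenstrauss-type sketch: n x p scaled Gaussian.
        P = rng.standard_normal((n, p)) / np.sqrt(p)

        # Build a linear model m(s) = fx + g^T s of f(x + P s) by sampling
        # f along each subspace direction (derivative-free: f-values only).
        h = 1e-1 * delta
        g = np.array([(f(x + h * P[:, j]) - fx) / h for j in range(p)])

        if np.linalg.norm(g) < 1e-10:
            delta *= gamma_dec
            continue

        # Trust-region step in the reduced space: for a linear model this
        # is the Cauchy step, i.e. steepest descent to the boundary.
        s = -delta * g / np.linalg.norm(g)
        x_trial = x + P @ s
        f_trial = f(x_trial)

        # Ratio of actual to predicted reduction decides acceptance.
        pred = -g @ s
        rho = (fx - f_trial) / pred if pred > 0 else -np.inf
        if rho >= eta:
            x, fx = x_trial, f_trial
            delta *= gamma_inc
        else:
            delta *= gamma_dec

    return x, fx


if __name__ == "__main__":
    # Toy usage: a 1000-dimensional convex quadratic.
    f = lambda x: float(np.sum((x - 1.0) ** 2))
    x_best, f_best = sketched_tr_dfo(f, np.zeros(1000), p=10)
    print(f"best value found: {f_best:.3f}")
```

Each iteration costs only p + 1 function evaluations and all linear algebra is done in the p-dimensional subspace, which is what makes the approach viable at thousands of decision variables.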

Status: accepted
