1015. Stochastic derivative-free optimization algorithms using random subspace strategies
Invited abstract in session TC-32: Algorithms for machine learning and inverse problems: zeroth-order optimisation, stream Advances in large scale nonlinear optimization.
Tuesday, 12:30-14:00, Room: 41 (building: 303A)
Authors (first author is the speaker)
1. Kwassi Joseph Dzahini, Mathematics and Computer Science Division, Argonne National Laboratory
2. Stefan M. Wild, Applied Mathematics and Computational Research Division, Lawrence Berkeley National Laboratory
Abstract
We propose trust-region and direct-search frameworks for large-scale stochastic derivative-free optimization by introducing the STARS and StoDARS algorithms. STARS achieves scalability by minimizing random models that approximate the objective in low-dimensional affine subspaces, thereby significantly reducing per-iteration costs in terms of function evaluations; StoDARS achieves the same goal by exploring the decision space along poll directions generated in random subspaces. These subspaces and their dimension are chosen via Johnson--Lindenstrauss transforms, such as those obtained from Haar-distributed orthogonal random matrices. The quality of the subspaces and sets of poll directions, as well as the accuracy of the estimates and models used by the algorithms, are required to hold with sufficiently high, but fixed, probabilities. Convergence and complexity results are obtained for both methods using martingale theory. In particular, by leveraging the ability of StoDARS to generate a dense set of poll directions, its almost sure convergence to Clarke stationary points is established. Moreover, the second-order analysis of the well-known mesh adaptive direct-search algorithms, which relies on a second-order extension of the Rademacher's-theorem-based definition of the Clarke subdifferential (the so-called generalized Hessian), is extended to the StoDARS framework, making it, to the best of our knowledge, the first such analysis in a stochastic direct-search setting.
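To make the random-subspace idea concrete, the sketch below illustrates one way a direct-search poll step can be restricted to a low-dimensional random subspace drawn from a Haar-distributed orthogonal matrix (obtained via QR factorization of a Gaussian matrix). This is a minimal illustration under our own assumptions, not the authors' STARS/StoDARS implementation: the helper names `haar_orthogonal` and `poll_in_random_subspace` are hypothetical, a simple-decrease test stands in for the paper's stochastic estimates, and the probabilistic accuracy requirements and step-size update are omitted.

```python
import numpy as np

def haar_orthogonal(n: int, p: int, rng: np.random.Generator) -> np.ndarray:
    """Return an n-by-p matrix with orthonormal, Haar-distributed columns,
    built from the QR factorization of a standard Gaussian matrix."""
    G = rng.standard_normal((n, p))
    Q, R = np.linalg.qr(G)          # reduced QR: Q is n-by-p, R is p-by-p
    Q *= np.sign(np.diag(R))        # sign fix so the distribution is Haar
    return Q

def poll_in_random_subspace(f, x, delta, p, rng):
    """One illustrative poll step confined to a random p-dimensional subspace.

    Polls along the maximal positive basis {+q_i, -q_i} of the subspace
    spanned by the columns of a Haar-distributed orthogonal matrix.
    (Hypothetical helper; simple decrease replaces stochastic estimates.)
    """
    n = x.size
    Q = haar_orthogonal(n, p, rng)
    directions = np.hstack([Q, -Q]).T   # 2p unit poll directions
    fx = f(x)
    for d in directions:
        trial = x + delta * d
        if f(trial) < fx:               # accept the first decrease found
            return trial, True
    return x, False                     # unsuccessful poll

# Toy usage: a 1000-dimensional quadratic polled in a 5-dimensional subspace.
rng = np.random.default_rng(0)
f = lambda x: float(np.sum(x**2))
x, success = poll_in_random_subspace(f, np.ones(1000), delta=0.5, p=5, rng=rng)
```

The point of the construction is that each poll costs at most 2p function evaluations instead of the roughly 2n required by a full-space positive basis, which is the per-iteration saving the abstract refers to.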
Keywords
- Stochastic Optimization
- Large Scale Optimization
- Algorithms
Status: accepted