57. Solving 10,000-Dimensional Optimization Problems Using Inaccurate Function Values: An Old Algorithm
Invited abstract in session MB-35: Nonlinear Optimization Algorithms and Applications 1, stream Continuous and mixed-integer nonlinear programming: theory and algorithms.
Monday, 10:30-12:00, Room: Michael Sadler LG15
Authors (first author is the speaker)
1. Zaikun Zhang, School of Mathematics, Sun Yat-sen University
Abstract
We re-introduce a derivative-free subspace optimization framework originating from Chapter 5 of the thesis [Z. Zhang, On Derivative-Free Optimization Methods, PhD thesis, Chinese Academy of Sciences, Beijing, 2012], written by the author under the supervision of Ya-xiang Yuan. At each iteration, the framework defines a (low-dimensional) subspace based on an approximate gradient, and then solves a subproblem in this subspace to generate a new iterate. We sketch the global convergence and worst-case complexity analysis of the framework, elaborate on its implementation, and present numerical results on problems with dimension as high as 10,000. The same framework was presented at ICCOPT 2013 in Lisbon under the title "A Derivative-Free Optimization Algorithm with Low-Dimensional Subspace Techniques for Large-Scale Problems", although it remained nearly unknown to the community until very recently. An algorithm following this framework, named NEWUOAs, was implemented by Zhang in MATLAB in 2011 (https://github.com/newuoas/newuoas), ported to Modula-3 by Nystroem (Intel) in 2017, and included in cm3 in 2019 (https://github.com/modula3/cm3/blob/master/caltech-other/newuoa/src/NewUOAs.m3).
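To make the iteration described above concrete, the following is a minimal sketch, not the NEWUOAs implementation itself: it assumes a forward-difference gradient estimate, a subspace spanned by the approximate gradient and the previous step, and an off-the-shelf derivative-free solver (SciPy's Nelder-Mead) for the subspace subproblem. All names and numerical choices here are illustrative assumptions rather than the method of the thesis.

```python
import numpy as np
from scipy.optimize import minimize


def approx_grad(f, x, h=1e-6):
    """Forward-difference gradient estimate from (possibly inaccurate) function values."""
    fx = f(x)
    g = np.zeros_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = h
        g[i] = (f(x + e) - fx) / h
    return g


def subspace_dfo(f, x0, max_iter=100, h=1e-6, tol=1e-10):
    """Illustrative subspace iteration: build a low-dimensional subspace from an
    approximate gradient (and the previous step), then minimize f restricted to
    that subspace to obtain the next iterate."""
    x = np.asarray(x0, dtype=float)
    prev_step = None
    for _ in range(max_iter):
        g = approx_grad(f, x, h)
        # Directions spanning the subspace: the approximate gradient, plus the
        # previous step once one is available.
        dirs = [g] if prev_step is None else [g, prev_step]
        Q, _ = np.linalg.qr(np.column_stack(dirs))  # orthonormal basis, n-by-k
        # Subproblem: minimize f(x + Q z) over the k-dimensional variable z.
        # Nelder-Mead stands in here for whatever subproblem solver one prefers.
        res = minimize(lambda z: f(x + Q @ z), np.zeros(Q.shape[1]),
                       method="Nelder-Mead")
        step = Q @ res.x
        if np.linalg.norm(step) < tol:
            break
        x = x + step
        prev_step = step
    return x


# Example usage on a mildly noisy quadratic (small dimension, for illustration only).
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    f = lambda x: float(np.sum(x ** 2) + 1e-8 * rng.standard_normal())
    x_final = subspace_dfo(f, np.ones(50))
    print(np.linalg.norm(x_final))
```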
Keywords
- Programming, Nonlinear
- Mathematical Programming
- Algorithms
Status: accepted