152. On Global Rates for Regularization Methods Based on Secant Derivative Approximations
Invited abstract in session WB-2: High-order and tensor methods, stream Nonsmooth and nonconvex optimization.
Wednesday, 10:30-12:30, Room: B100/7011
Authors (first author is the speaker)
1. Sadok Jerad, Mathematical Institute, University of Oxford
2. Coralia Cartis, Mathematical Institute, University of Oxford
Abstract
An approximation framework for adaptive regularization methods is presented, in which approximations are allowed only for the $p$th-order tensor. Between recomputations of the $p$th-order derivative approximation, the tensor can either be updated via a high-order secant equation, as proposed in (Welzel2024), or kept constant in a lazy manner. When refreshing the approximation after $m$ steps, either an exact evaluation of the tensor or a finite-difference approximation with an explicit discretization stepsize can be used. For all these newly introduced variants, we establish the standard iteration complexity bounds of adaptive regularization methods. The results are also specialized to quasi-Newton methods for $p=2$.
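For context, the secant conditions at play can be sketched as follows; this is a minimal illustration under assumed notation ($B_k$, $T_k$, $s_k$, $y_k$), not the precise formulation of the talk. For $p=2$, the classical quasi-Newton secant equation asks the Hessian approximation to match the observed gradient change along the step:
\[
  B_{k+1} s_k = y_k, \qquad s_k = x_{k+1} - x_k, \quad y_k = \nabla f(x_{k+1}) - \nabla f(x_k).
\]
A natural $p$th-order analogue requires the tensor approximation $T_{k+1}$ of $\nabla^p f$ to reproduce the change in the $(p-1)$st derivative along the step,
\[
  T_{k+1}[s_k] = \nabla^{p-1} f(x_{k+1}) - \nabla^{p-1} f(x_k),
\]
while a finite-difference refresh with discretization stepsize $h$ approximates the action of the tensor along a direction $d$ by
\[
  T_{k+1}[d] \approx \frac{\nabla^{p-1} f(x_k + h\,d) - \nabla^{p-1} f(x_k)}{h}.
\]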
Keywords
- Second- and higher-order optimization
- Linear and nonlinear optimization
- Large-scale optimization
Status: accepted