454. Cubic regularized Newton methods with stochastic Hessian evaluations and momentum-based variance reduction
Invited abstract in session WB-8: Theoretical advances in nonconvex optimization, stream Large scale optimization: methods and algorithms.
Wednesday, 10:30-12:30, Room: B100/7007
Authors (first author is the speaker)
1. Yiming Yang, Mathematics, Linköping University
Abstract
Second-order methods have received considerable attention in recent years, especially as many of them have been developed to efficiently reach second-order stationary points in nonconvex optimization problems. However, second-order information, i.e., the Hessian matrix, can be computationally costly to evaluate in many real-world applications, which limits the practical utility of these methods. In this work, we introduce a new variant of the cubic regularized Newton method that does not require exact Hessian evaluations but relaxes them to stochastic Hessian evaluations. This relaxation reduces the per-iteration cost of second-order methods and allows greater flexibility in incorporating second-order information into optimization algorithm design. We establish the worst-case iteration complexity of our proposed cubic regularized Newton method with stochastic Hessian evaluations for finding approximate second-order stationary points. In addition, we show that, like the vanilla cubic regularized Newton method, our proposed variant still achieves better complexity than first-order methods.
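To make the setting concrete, the following is a minimal sketch (not the authors' algorithm) of a cubic regularized Newton iteration in which the Hessian of a finite-sum objective is replaced by a mini-batch estimate. All names (`cubic_subproblem`, `stochastic_cubic_newton`), the choice of regularization constant `M`, and the use of a simple gradient-descent inner solver for the cubic subproblem are illustrative assumptions; the paper's actual method, step rules, and variance-reduction scheme may differ.

```python
import numpy as np

def cubic_subproblem(g, H, M, iters=200, lr=0.02):
    # Approximately minimize the cubic model
    #   m(s) = g^T s + 0.5 s^T H s + (M/6) ||s||^3
    # by gradient descent on s. This is only a sketch: practical
    # implementations solve the subproblem via a scalar secular
    # equation or a Krylov method instead.
    s = np.zeros_like(g)
    for _ in range(iters):
        grad_m = g + H @ s + 0.5 * M * np.linalg.norm(s) * s
        s -= lr * grad_m
    return s

def stochastic_cubic_newton(grad_f, hess_i, x0, n, batch=16, M=10.0,
                            steps=30, rng=None):
    # hess_i(x, i) returns the Hessian of the i-th component function;
    # a mini-batch average stands in for the exact Hessian. The gradient
    # is kept exact here, since only the Hessian is stochastic in this
    # illustration.
    rng = np.random.default_rng(rng)
    x = x0.copy()
    for _ in range(steps):
        g = grad_f(x)
        idx = rng.choice(n, size=batch, replace=False)
        H = np.mean([hess_i(x, i) for i in idx], axis=0)
        x = x + cubic_subproblem(g, H, M)
    return x
```

On a toy least-squares problem with per-sample Hessians `a_i a_i^T`, the mini-batch Hessian is cheap to form (cost proportional to the batch size rather than `n`), which is the per-iteration saving the abstract refers to.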
Keywords
- Complexity and efficiency of algorithms
- Second- and higher-order optimization
- Stochastic optimization
Status: accepted