557. Stochastic first-order methods can leverage arbitrarily higher-order smoothness for acceleration
Invited abstract in session WC-5: Recent Advances in Stochastic Optimization, stream Optimization for machine learning.
Wednesday, 14:00-16:00, Room: B100/4013
Authors (first author is the speaker)
| 1. | Chuan He | Linköping University |
Abstract
Stochastic first-order optimization methods play a crucial role in modern artificial intelligence. From a theoretical perspective, worst-case sample complexity is an important measure of the computational cost of stochastic algorithms. In this talk, I will introduce a new stochastic first-order method with multi-extrapolated momentum to leverage the Lipschitz continuity of arbitrarily high-order derivatives for acceleration. Surprisingly, this work highlights that higher-order smoothness can play an important role in the analysis of stochastic first-order methods.
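To make the idea of multi-extrapolated momentum concrete, the sketch below shows a hypothetical stochastic gradient loop in which the gradient is queried at a point extrapolated from several past iterate differences, generalizing Nesterov's single extrapolation step. This is an illustrative toy, not the speaker's actual algorithm; the function name `sgd_multi_extrap`, the coefficients `betas`, and the noise model are all assumptions made for demonstration.

```python
import numpy as np

def sgd_multi_extrap(grad, x0, lr=0.1, betas=(0.5, 0.3), iters=200, seed=0):
    """Illustrative SGD with multi-extrapolated momentum (hypothetical variant).

    The stochastic gradient is evaluated at an extrapolated point
        y_k = x_k + sum_i betas[i] * (x_{k-i} - x_{k-i-1}),
    i.e. a combination of several past iterate differences rather than
    the single difference used in standard heavy-ball/Nesterov momentum.
    """
    rng = np.random.default_rng(seed)
    # History of iterates: hist[0] = x_k, hist[1] = x_{k-1}, ...
    hist = [np.array(x0, dtype=float)] * (len(betas) + 1)
    for _ in range(iters):
        x = hist[0]
        # Multi-step extrapolation over past iterate differences.
        y = x + sum(b * (hist[i] - hist[i + 1]) for i, b in enumerate(betas))
        # Noisy (stochastic) gradient at the extrapolated point.
        g = grad(y) + 0.01 * rng.standard_normal(np.shape(x))
        x_new = x - lr * g
        hist = [x_new] + hist[:-1]
    return hist[0]

# Usage: minimize f(x) = ||x||^2 / 2, whose gradient is grad f(x) = x.
x_star = sgd_multi_extrap(lambda x: x, x0=[2.0, -1.0])
```

With two extrapolation coefficients the update uses three past iterates; the theory discussed in the talk analyzes how such schemes can exploit higher-order smoothness, which this toy does not attempt to reproduce.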
Keywords
- Stochastic optimization
- First-order optimization
- Complexity and efficiency of algorithms
Status: accepted