511. Continuized Nesterov Acceleration to improve convergence speed in nonconvex optimization
Invited abstract in session WB-8: Theoretical advances in nonconvex optimization, stream Large scale optimization: methods and algorithms.
Wednesday, 10:30-12:30, Room: B100/7007
Authors (first author is the speaker)
1. Julien Hermant
2. Jean-François Aujol, IMB, Université de Bordeaux
3. Charles Dossal, INSA Toulouse
4. Aude Rondepierre, Département Génie Mathématiques et Modélisation, INSA Toulouse
Abstract
In the realm of smooth and convex functions, it is well known that in many scenarios the Nesterov Accelerated Gradient (NAG) algorithm converges to the minimum significantly faster than Gradient Descent. When the convexity assumption is dropped, this acceleration result is challenged. We show that a variant, the continuized version of NAG introduced in [1], offers the opportunity to achieve new convergence results in settings where nonconvexity hinders the traditional NAG algorithm.
[1] M. Even, R. Berthier et al., "A Continuized View on Nesterov Acceleration for Stochastic Gradient Descent and Randomized Gossip", NeurIPS 2021.
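For illustration, the following is a minimal numerical sketch of the continuized mechanism underlying [1]: gradient steps are taken at the jump times of a rate-1 Poisson process, and between jumps the two iterates mix along a linear flow that can be integrated in closed form. The function name `continuized_nag`, the parameter schedules (mixing rate 2/t, gradient step gamma on x and gamma*t/2 on z), and the quadratic test problem are illustrative assumptions for the smooth convex case, not the algorithm or parameter choices analyzed in the talk.

```python
import numpy as np

def continuized_nag(grad, x0, gamma, T, rng=None):
    """Sketch of a continuized Nesterov-type scheme (convex-case schedules assumed).

    Gradient steps occur at the jump times of a rate-1 Poisson process;
    between jumps, x relaxes towards z along dx/dt = (2/t)(z - x), which is
    integrated exactly.  All parameter choices here are illustrative.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float).copy()
    z = x.copy()
    t = 1e-8  # continuous time, started just above 0
    while t < T:
        s = t + rng.exponential(1.0)      # next Poisson jump time
        # Closed-form mixing with z frozen: x(s) = z + (t/s)^2 * (x(t) - z)
        x = z + (t / s) ** 2 * (x - z)
        t = s
        g = grad(x)
        x = x - gamma * g                 # gradient step on x
        z = z - gamma * (t / 2.0) * g     # larger, time-weighted step on z
    return x

# Hypothetical usage on a smooth quadratic f(x) = 0.5 * ||A x - b||^2
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, -1.0])
grad_f = lambda x: A.T @ (A @ x - b)
L = np.linalg.norm(A.T @ A, 2)            # smoothness constant of grad_f
x_approx = continuized_nag(grad_f, x0=np.zeros(2), gamma=1.0 / L, T=50.0)
```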
Keywords
- First-order optimization
- Stochastic optimization
Status: accepted