2967. EXTRA-NEWTON: A First Approach to Noise-Adaptive Accelerated Second-Order Methods
Invited abstract in session WA-32: Adaptive and Polyak step-size methods, stream Advances in large scale nonlinear optimization.
Wednesday, 8:30-10:00, Room: 41 (building: 303A)
Authors (first author is the speaker)
1. Kimon Antonakopoulos, EPFL
Abstract
This work proposes a universal and adaptive second-order method for minimizing second-order smooth, convex functions. Our algorithm achieves the optimal convergence rate when the oracle feedback is stochastic with bounded variance, and improves its speed of convergence when it is run with deterministic oracles, where $T$ is the number of iterations. Our method also interpolates between these rates without knowing the nature of the oracle a priori, which is enabled by a parameter-free adaptive step-size that is oblivious to the smoothness modulus, the variance bounds, and the diameter of the constraint set. Building on this machinery, we show that we may be able to extract asymptotically faster rates than Nesterov's standard accelerated second-order methods.
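To illustrate the general flavor of a parameter-free adaptive step-size, the sketch below runs an AdaGrad-style rule that adapts only to observed gradients, requiring no knowledge of the smoothness modulus, variance, or domain diameter. This is not the EXTRA-NEWTON method of the abstract (whose details are not given here); the test problem, function names, and constants are all illustrative assumptions.

```python
import numpy as np

def adaptive_descent(grad, x0, iters=500, eta=1.0, eps=1e-8):
    """First-order descent with an AdaGrad-style adaptive step size.

    The step size eta / sqrt(sum of squared gradient norms) shrinks
    automatically as gradient information accumulates; no smoothness
    constant, variance bound, or diameter is supplied.
    """
    x = x0.astype(float).copy()
    g2_sum = 0.0
    for _ in range(iters):
        g = grad(x)
        g2_sum += float(g @ g)                    # accumulated gradient energy
        x -= eta / (np.sqrt(g2_sum) + eps) * g    # adaptive, parameter-free step
    return x

# Hypothetical test problem: an ill-conditioned convex quadratic f(x) = 0.5 x^T A x.
A = np.diag([1.0, 10.0])
grad = lambda x: A @ x
x_final = adaptive_descent(grad, np.array([3.0, -2.0]))
print(np.linalg.norm(x_final))  # distance to the minimizer x* = 0 shrinks
```

The point of the sketch is the step-size rule itself: because it is built from observed gradients only, the same code behaves sensibly whether `grad` returns exact or noisy gradients, which is the "oblivious to the oracle" property the abstract highlights (here shown for a first-order method, not the second-order setting of the paper).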
Keywords
- Convex Optimization
- Continuous Optimization
- Algorithms
Status: accepted