169. Gradient Descent for Noisy Optimization
Invited abstract in session TC-6: Stochastic methods and applications, stream Methods for non-/monotone inclusions and their applications.
Thursday, 11:25 - 12:40, Room: M:H
Authors (first author is the speaker)
| 1. | Feifei Hu | Department of Mathematics, University of Bristol |
Abstract
We study the use of gradient descent with backtracking line search (GD-BLS) to solve noisy optimization problems, assuming that the objective function is strictly convex. Without making any global assumptions on the objective function, and without assuming that the variance of the stochastic gradient is bounded, we show that GD-BLS estimates the minimiser of the objective function with an error bounded by a quantity proportional to the computational budget raised to the power -0.25. We show that this rate can be improved by stopping the optimization process early, once the gradient of the objective function is sufficiently close to zero, and then using the residual computational budget to run GD-BLS on a finer approximation of the objective function. We also establish a more general convergence result for GD-BLS under an additional assumption. Beyond the knowledge of one parameter, achieving the aforementioned convergence rates does not require tuning the algorithms' parameters to the specific objective function at hand, and we exhibit a simple noisy optimization problem for which stochastic gradient descent is not guaranteed to converge while the algorithms discussed in this work are.
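To make the procedure described above concrete, the following is a minimal Python sketch of gradient descent with backtracking (Armijo) line search and of the two-phase budget-splitting idea mentioned in the abstract. It is illustrative only: the helper `make_approx`, the names `gd_bls`, `two_phase_gd_bls`, the equal budget split, and the Armijo parameters `alpha` and `beta` are assumptions for the sketch, not the authors' exact algorithm or analysis.

```python
import numpy as np

def gd_bls(f, grad, x0, budget, alpha=0.3, beta=0.5, tol=0.0):
    """Gradient descent with backtracking (Armijo) line search on a fixed
    approximation f of the noisy objective. Stops once `budget` evaluations
    of f/grad are spent, or once the gradient norm drops below `tol`
    (the early-stopping rule mentioned in the abstract)."""
    x = np.asarray(x0, dtype=float)
    evals = 0
    while evals < budget:
        g = grad(x); evals += 1
        if np.linalg.norm(g) <= tol:
            break                      # gradient sufficiently close to zero
        fx = f(x); evals += 1
        t = 1.0
        while True:                    # backtracking: shrink step until Armijo decrease holds
            trial = f(x - t * g); evals += 1
            if trial <= fx - alpha * t * g.dot(g) or evals >= budget:
                break
            t *= beta
        x = x - t * g
    return x, evals

def two_phase_gd_bls(make_approx, x0, budget, n_coarse, n_fine, tol):
    """Two-phase scheme sketched in the abstract: run GD-BLS on a coarse
    approximation of the objective, stop early when the gradient is small,
    then spend the residual budget on a finer approximation.
    `make_approx(n)` is an assumed helper returning (f, grad) built from
    n noisy samples; it is illustrative, not the authors' construction."""
    f1, g1 = make_approx(n_coarse)     # coarse approximation, first phase
    x, used = gd_bls(f1, g1, x0, budget // 2, tol=tol)
    f2, g2 = make_approx(n_fine)       # finer approximation, second phase
    x, _ = gd_bls(f2, g2, x, budget - used, tol=0.0)
    return x
```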
Keywords
- Global optimization
- Complexity and efficiency of optimization algorithms
- Large- and Huge-scale optimization
Status: accepted