A variable metric stochastic aggregated gradient algorithm for convex optimization

Invited abstract in session TA-54: Large scale structured optimization 2, stream Convex Optimization.

Area: Continuous Optimization

Tuesday, 8:30-10:00
Room: Building PA, Room B

Authors (first author is the speaker)

1. Andreas Themelis
IMT School for Advanced Studies Lucca
2. Silvia Villa
DIMA, Università di Genova
3. Panagiotis Patrinos
Electrical Engineering, KU Leuven
4. Alberto Bemporad
IMT Institute for Advanced Studies Lucca

Abstract

In this talk we propose and study a novel variable metric stochastic gradient algorithm for minimizing the sum of a convex nonsmooth function and a large number of convex smooth functions. For problems of this kind, incremental gradient methods have proven very effective and have recently received much attention, as they achieve the same theoretical convergence rates as full gradient methods, yet with a per-iteration computational cost independent of the number of components. Following this trend, we extend the theory behind SAGA [1], an incremental aggregated gradient method, by integrating the algorithm with a stochastic variable metric scheme. This opens the door to accounting for the sensitivity to ill-conditioning that characterizes first-order methods, to incorporating curvature information, and to exploiting specific problem structures such as sparsity. We then propose a stochastic diagonal scaling in the fashion of AdaGrad [2], intended for sparse problems, and demonstrate the benefits of the chosen metric through numerical simulations.
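To make the general idea concrete, below is a minimal, purely illustrative Python sketch of a SAGA-style iteration combined with an AdaGrad-like diagonal scaling. It is not the authors' algorithm: the function names (grad_fi, prox_l1), the choice of the L1 norm as the nonsmooth term, the step-size gamma, and the epsilon safeguard are all assumptions made here for illustration only.

```python
import numpy as np

def prox_l1(x, step):
    """Proximal operator of step * ||x||_1 (soft-thresholding).
    Stands in for the nonsmooth convex term; chosen here only as an example."""
    return np.sign(x) * np.maximum(np.abs(x) - step, 0.0)

def saga_diag_metric(grad_fi, n, x0, gamma, n_epochs=10, eps=1e-8, rng=None):
    """Illustrative SAGA-style loop with an AdaGrad-like diagonal metric.

    grad_fi(x, i) : gradient of the i-th smooth component f_i at x (user-supplied)
    n             : number of smooth components
    x0            : starting point
    gamma         : base step size (hypothetical tuning parameter)
    """
    rng = np.random.default_rng() if rng is None else rng
    x = x0.copy()
    # SAGA memory: most recently evaluated gradient of each component.
    table = np.array([grad_fi(x, i) for i in range(n)])
    avg = table.mean(axis=0)
    # Accumulator driving the AdaGrad-style diagonal scaling.
    acc = np.zeros_like(x)

    for _ in range(n_epochs * n):
        i = rng.integers(n)
        g_new = grad_fi(x, i)
        # SAGA aggregated (unbiased) gradient estimator.
        v = g_new - table[i] + avg
        # Update the diagonal metric from the observed stochastic gradients.
        acc += v ** 2
        d = 1.0 / (np.sqrt(acc) + eps)   # diagonal of the (inverse) metric
        # Scaled proximal-gradient step; since the L1 norm is separable,
        # the prox in a diagonal metric reduces to componentwise thresholding.
        x = prox_l1(x - gamma * d * v, gamma * d)
        # Refresh the running average before overwriting the memory slot.
        avg += (g_new - table[i]) / n
        table[i] = g_new
    return x
```

The diagonal scaling is meant to mimic the sparsity-oriented behaviour of AdaGrad: coordinates that rarely receive large gradient components keep a larger effective step size, while frequently updated coordinates are damped.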

Keywords

Status: accepted

