1907. Stochastic Mirror Descent for Convex Optimization with Consensus Constraints
Invited abstract in session MD-34: Preconditioning for Large Scale Nonlinear Optimization, stream Advances in large scale nonlinear optimization.
Monday, 14:30-16:00, Room: 43 (building: 303A)
Authors (first author is the speaker)
1. Panos Parpas
   Computing, Imperial College London
Abstract
The mirror descent algorithm is known to be effective in situations where it is beneficial to adapt the mirror map to the underlying geometry of the optimization model.
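As a point of reference for the abstract, the following is a minimal sketch of (centralized) mirror descent with the entropy mirror map on the probability simplex, a standard example of adapting the update to the model's geometry. It is a generic illustration, not the algorithm of the paper; the objective and step size are illustrative choices.

```python
import numpy as np

def mirror_descent_simplex(grad, x0, steps=100, eta=0.1):
    """Mirror descent with the entropy mirror map on the simplex.

    With the entropic mirror map, the mirror step becomes an
    exponentiated-gradient (multiplicative-weights) update followed
    by renormalization, which is the Bregman projection for entropy.
    `grad` returns a gradient of the objective at the current point.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        x = x * np.exp(-eta * grad(x))  # dual (mirror) step
        x /= x.sum()                    # project back onto the simplex
    return x

# Minimize a linear objective <c, x> over the simplex: the iterates
# should concentrate on the coordinate with the smallest cost.
c = np.array([3.0, 1.0, 2.0])
x_star = mirror_descent_simplex(lambda x: c, np.ones(3) / 3, steps=500)
```

For a linear objective the iterates satisfy `x_i ∝ exp(-eta * t * c_i)`, so mass concentrates exponentially fast on the cheapest coordinate; this geometry-adapted behaviour on the simplex is what makes the entropic mirror map preferable to a Euclidean projected step there.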
However, the effect of mirror maps on the geometry of distributed optimization problems has not been previously addressed. In this paper we study an exact distributed mirror descent algorithm in continuous-time under additive noise.
We establish a linear convergence rate of the proposed dynamics for the setting of convex optimization.
Our analysis draws motivation from the Augmented Lagrangian and its relation to gradient tracking.
To further explore the benefits of mirror maps in a distributed setting we present a preconditioned variant of our algorithm with an additional mirror map over the Lagrangian dual variables. This allows our method to adapt to both the geometry of the primal variables, as well as to the geometry of the consensus constraint.
We also propose a Gauss-Seidel type discretization scheme for the proposed method and establish its linear convergence rate.
For certain classes of problems we identify mirror maps that mitigate the effect of the graph's spectral properties on the convergence rate of the algorithm.
Using numerical experiments we demonstrate the efficiency of the methodology on convex models, both with and without constraints. Our findings show that the proposed method outperforms other methods, especially in scenarios where the model's geometry is not captured by the Euclidean geometry.
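To make the distributed setting concrete, here is a simplified discrete-time sketch of stochastic mirror descent with a consensus (mixing) step over a network: each agent averages with its neighbours through a doubly stochastic weight matrix `W`, then takes an entropic mirror step on its own noisy gradient. This is a plain mixing-based sketch for illustration only; it is not the exact continuous-time dynamics, gradient-tracking scheme, or dual preconditioning studied in the paper, and all names and parameters below are illustrative.

```python
import numpy as np

def distributed_smd(grads, W, x0, steps=200, eta=0.05, noise=0.0, seed=0):
    """Distributed stochastic mirror descent on the simplex (sketch).

    Each of the n agents holds a point on the simplex. Per iteration:
    a consensus step mixes the agents' points through the doubly
    stochastic matrix W, then each agent takes an entropic mirror
    step on its local gradient, perturbed by additive Gaussian noise.
    """
    rng = np.random.default_rng(seed)
    X = np.array(x0, dtype=float)            # shape (n_agents, dim)
    for _ in range(steps):
        X = W @ X                            # consensus (mixing) step
        for i, g in enumerate(grads):
            noisy = g(X[i]) + noise * rng.standard_normal(X.shape[1])
            X[i] = X[i] * np.exp(-eta * noisy)  # entropic mirror step
            X[i] /= X[i].sum()               # renormalize onto simplex
    return X

# Two agents minimizing the sum of two linear costs; they should
# reach consensus and favour the coordinate with smallest total cost.
c1, c2 = np.array([2.0, 0.5, 1.0]), np.array([0.0, 1.0, 2.0])
W = np.array([[0.5, 0.5], [0.5, 0.5]])
X = distributed_smd([lambda x: c1, lambda x: c2], W,
                    np.ones((2, 3)) / 3, steps=400)
```

In this toy run the total cost `c1 + c2 = [2.0, 1.5, 3.0]` is minimized at the middle coordinate, so both agents concentrate there. The mixing matrix `W` plays the role the graph's spectral properties play in the abstract: the slower the mixing, the slower consensus, which is the dependence the paper's preconditioned dual mirror map aims to mitigate.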
Keywords
- Continuous Optimization
Status: accepted