493. Tight analyses of first-order methods with error feedback
Invited abstract in session MD-8: Systematic and computer-aided analyses III: noisy gradient methods and fixed-point algorithms, stream Systematic and computer-aided analyses of optimization algorithms.
Monday, 16:30-18:30, Room: B100/7007
Authors (first author is the speaker)
1. Daniel Berg Thomsen (INRIA)
Abstract
Error feedback (EF), including newer variants such as EF21, is used in distributed optimization to reduce communication bandwidth. Recently, methods using error feedback have been proposed and analyzed in a variety of settings: single- and multi-worker configurations, with or without stochastic gradient oracles, and so on. As a result, there are many different claims about both the theoretical and practical performance of these methods in different contexts. In this talk, we aim to provide a clearer picture of the pros and cons of some of these methods. We first focus on the single-worker setting with deterministic gradients, as it presents a simpler problem statement with fewer possible variations. This allows us to better understand the performance of these methods and to compare them meaningfully with one another. To do so, we use an automated approach based on the performance estimation problem (PEP) framework to identify simple yet optimal Lyapunov functions, along with functions that achieve the corresponding bounds.
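As a minimal sketch of the mechanism discussed above, the classical single-worker error-feedback loop with deterministic gradients can be written as follows. The top-k compressor, the step size, and the toy quadratic objective are illustrative assumptions, not details taken from the talk:

```python
import numpy as np

def top_k(v, k):
    """Keep the k largest-magnitude entries of v, zero the rest (a contractive compressor)."""
    out = np.zeros_like(v)
    idx = np.argsort(np.abs(v))[-k:]
    out[idx] = v[idx]
    return out

def ef_gradient_descent(grad, x0, lr=0.1, k=1, steps=2000):
    """Single-worker error feedback: compress the step plus the accumulated error."""
    x = x0.copy()
    e = np.zeros_like(x0)          # error accumulator (memory of discarded components)
    for _ in range(steps):
        p = e + lr * grad(x)       # re-inject previously dropped information
        g = top_k(p, k)            # only the compressed vector would be communicated
        e = p - g                  # store what the compressor dropped
        x = x - g
    return x

# Toy example: minimize f(x) = 0.5 * ||A x - b||^2 with a deterministic gradient oracle.
A = np.array([[2.0, 0.5], [0.5, 1.0]])
b = np.array([1.0, -1.0])
grad = lambda x: A.T @ (A @ x - b)
x_star = np.linalg.solve(A.T @ A, A.T @ b)
x = ef_gradient_descent(grad, np.zeros(2))
```

The key design point is that the compressor is applied to the sum of the fresh step and the accumulated error `e`, so information dropped in one iteration is retried in later ones; variants such as EF21 differ in what is compressed and what is stored.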
Keywords
- Complexity and efficiency of algorithms
- Computer-aided algorithm analysis
- Distributed optimization
Status: accepted