249. Provable Reduction in Communication Rounds for Non-Smooth Convex Federated Learning
Invited abstract in session MD-2: Optimization in Machine Learning, stream Nonsmooth and Nonconvex Optimization.
Monday, 16:30-18:30, Room: B100/7011
Authors (first author is the speaker)
1. Karlo Palenzuela, Computing Science, Umeå University
2. Ali Dadras, Mathematics and Mathematical Statistics, Umeå University
3. Alp Yurtsever, Umeå University
Abstract
Multiple local steps are key to communication-efficient federated learning. However, for general non-smooth convex problems, theoretical guarantees for such algorithms have been lacking without assumptions that bound data heterogeneity. Leveraging projection-efficient optimization methods, we propose FedMLS, a federated learning algorithm with provable improvements from multiple local steps. FedMLS attains an $\epsilon$-suboptimal solution in $\mathcal{O}(1/\epsilon)$ communication rounds, requiring a total of $\mathcal{O}(1/\epsilon^2)$ stochastic subgradient oracle calls.
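The abstract does not spell out the FedMLS update rule, so the sketch below is not the authors' method. It is a minimal illustration of the general "multiple local steps between communications" template (FedAvg-style local subgradient descent with periodic averaging) on a toy non-smooth convex problem; the function names and parameter choices (`local_subgradient`, `eta`, `local_steps`) are illustrative assumptions.

```python
import numpy as np

def federated_local_subgradient(x0, clients, local_subgradient,
                                rounds=100, local_steps=10, eta=0.01):
    """Generic local-SGD template (not FedMLS): each communication round,
    every client runs `local_steps` stochastic subgradient steps from the
    shared model, then the server averages the resulting client models."""
    x = np.asarray(x0, dtype=float)
    for _ in range(rounds):                      # one communication round
        local_models = []
        for c in clients:
            x_c = x.copy()                       # start from the server model
            for _ in range(local_steps):         # local steps: no communication
                g = local_subgradient(c, x_c)    # stochastic subgradient oracle
                x_c = x_c - eta * g
            local_models.append(x_c)
        x = np.mean(local_models, axis=0)        # server-side averaging
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    targets = [rng.normal(size=5) for _ in range(4)]  # one "dataset" per client

    # Non-smooth convex local objective f_c(x) = ||x - a_c||_1,
    # whose subgradient is sign(x - a_c).
    def l1_subgradient(a_c, x):
        return np.sign(x - a_c)

    x_hat = federated_local_subgradient(np.zeros(5), targets, l1_subgradient,
                                        rounds=200, local_steps=5, eta=0.05)
    print(x_hat)  # drifts toward the coordinate-wise median of the targets
```

Read against the abstract's rates, such a template would spread the $\mathcal{O}(1/\epsilon^2)$ oracle budget over $\mathcal{O}(1/\epsilon)$ rounds, i.e., on the order of $1/\epsilon$ subgradient evaluations per communication round; the stated contribution of FedMLS is making this trade-off provable without heterogeneity-bounding assumptions.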
Keywords
- Non-smooth optimization
Status: accepted