EUROPT 2024
Abstract Submission

132. Investigating Variance Definitions for Stochastic Mirror Descent with Relative Smoothness

Invited abstract in session FD-2: Deterministic and stochastic optimization beyond Euclidean geometry, stream Advances in first-order optimization.

Friday, 14:10 - 15:50
Room: M:O

Authors (first author is the speaker)

1. Hadrien Hendrikx
Inria Grenoble

Abstract

Mirror Descent is a popular algorithm that extends Gradient Descent (GD) beyond Euclidean geometry. One of its benefits is that it enables strong convergence guarantees through smoothness-style analyses, even for objectives with exploding or vanishing curvature. This is achieved through the notion of relative smoothness, which holds in many common use cases of Mirror Descent. While basic deterministic results extend well to the relative setting, most existing stochastic analyses require additional assumptions on the mirror map, such as strong convexity (in the usual sense), to ensure bounded variance. In this talk, we will revisit Stochastic Mirror Descent (SMD) proofs in the (relatively strongly) convex and relatively smooth setting, and introduce a new, less restrictive definition of variance that can generally be bounded globally under mild regularity assumptions. We will then investigate this notion in more detail and show that it naturally leads to strong convergence guarantees for SMD. Finally, we will leverage this new analysis to obtain convergence guarantees for computing the Maximum Likelihood Estimator of a Gaussian with unknown mean and variance.
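For reference, the standard definitions behind the abstract (textbook material, introduced by Bauschke, Bolte, and Teboulle and by Lu, Freund, and Nesterov; not specific to this talk): given a differentiable, convex mirror map \phi, the Bregman divergence and the Mirror Descent update read

  D_\phi(x, y) = \phi(x) - \phi(y) - \langle \nabla \phi(y), x - y \rangle,
  x_{t+1} = \arg\min_x \{ \eta_t \langle \nabla f(x_t), x \rangle + D_\phi(x, x_t) \},

and f is L-smooth relative to \phi when, for all x, y,

  f(y) \le f(x) + \langle \nabla f(x), y - x \rangle + L \, D_\phi(y, x),

with relative strong convexity given by the reversed inequality with a constant \mu in place of L. Taking \phi(x) = \|x\|^2 / 2 makes D_\phi the squared Euclidean distance and recovers plain GD together with the usual notions of smoothness and strong convexity.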
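As a purely illustrative sketch (a toy setup, not the authors' method nor the new variance definition from the talk), here is SMD in Python with the negative-entropy mirror map \phi(x) = \sum_i x_i \log x_i on the probability simplex, for which the mirror step has a closed-form multiplicative update:

import numpy as np

def smd_negative_entropy(stoch_grad, x0, eta, n_steps, rng):
    """Stochastic Mirror Descent on the probability simplex with the
    negative-entropy mirror map (exponentiated gradient).
    `stoch_grad(x, rng)` must return an unbiased estimate of the gradient."""
    x = np.asarray(x0, dtype=float)
    for _ in range(n_steps):
        g = stoch_grad(x, rng)
        # Mirror step: grad phi(x) = 1 + log x, so stepping by -eta * g in
        # the dual and mapping back (renormalizing onto the simplex) gives
        # a multiplicative update.
        x = x * np.exp(-eta * g)
        x /= x.sum()
    return x

# Toy usage (hypothetical objective): minimize the quadratic x^T A x over
# the simplex from noisy gradients 2 A x + noise.
rng = np.random.default_rng(0)
A = rng.random((5, 5))
A = (A + A.T) / 2
noisy_grad = lambda x, r: 2 * A @ x + 0.1 * r.standard_normal(5)
print(smd_negative_entropy(noisy_grad, np.ones(5) / 5, eta=0.1, n_steps=500, rng=rng))

Note that the negative entropy is 1-strongly convex (with respect to the l1 norm) on the simplex, so in this toy example the classical bounded-variance arguments already apply; the talk's contribution concerns mirror maps for which they do not.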

Status: accepted

