EUROPT 2025
Abstract Submission

357. Complexity guarantees for risk-neutral generalized Nash equilibrium problems

Invited abstract in session WC-5: Recent Advances in Stochastic Optimization, stream Optimization for machine learning.

Wednesday, 14:00-16:00
Room: B100/4013

Authors (first author is the speaker)

1. Meggie Marschner
Mathematical Optimization, Mannheim University
2. Mathias Staudigl
Department of Mathematics, Universität Mannheim

Abstract

In this paper we address equilibrium seeking in stochastic generalized Nash equilibrium problems (SGNEPs) with risk-neutral agents. The stochastic variance-reduced gradient (SVRG) technique is modified to contend with general sample spaces, and a distributed stochastic forward-backward-forward splitting scheme with variance reduction (DVRSFBF) is proposed for solving structured monotone inclusion problems. In DVRSFBF, a mini-batch gradient estimator is computed periodically in the outer loop, while only cheap sampling is required in the frequently activated inner loop, yielding significant speedups when sampling costs cannot be overlooked. The algorithm is fully distributed, and it guarantees almost sure convergence under appropriate batch-size and strong monotonicity assumptions. Moreover, it exhibits a linear rate even under possibly biased estimators, an assumption that is rather mild and satisfied in many simulation-based optimization schemes. A numerical study on a class of networked Cournot games illustrates the performance of the proposed algorithm.
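The double-loop structure described above can be sketched in a few lines. The following is an illustrative Python sketch only, not the authors' DVRSFBF method: it applies an SVRG-style forward-backward-forward iteration to a toy strongly monotone root-finding problem (the operator, dimensions, step size, and batch sizes are all assumptions for illustration). A mini-batch estimate is formed once per outer epoch at a snapshot point, while each inner step draws a single cheap sample.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative setup (not from the paper): a strongly monotone linear
# operator F(x) = mean(A_xi) @ x + b, accessible only through samples xi.
dim, n_samples = 5, 200
A_samples = [np.eye(dim) * (1.0 + 0.1 * rng.standard_normal())
             for _ in range(n_samples)]
b = rng.standard_normal(dim)

def F_sample(x, idx):
    """One cheap stochastic evaluation of the operator."""
    return A_samples[idx] @ x + b

def F_batch(x, idxs):
    """Mini-batch estimate of the operator (the expensive outer-loop step)."""
    return np.mean([F_sample(x, i) for i in idxs], axis=0)

def svrg_fbf(x0, n_outer=50, n_inner=10, step=0.05, batch=50):
    """SVRG-style double-loop forward-backward-forward sketch."""
    x = x0.copy()
    for _ in range(n_outer):
        # Outer loop: periodic mini-batch estimator at a snapshot point.
        snapshot = x.copy()
        idxs = rng.choice(n_samples, size=batch, replace=False)
        g_snap = F_batch(snapshot, idxs)
        for _ in range(n_inner):
            # Inner loop: a single cheap sample per iteration.
            i = rng.integers(n_samples)
            v = F_sample(x, i) - F_sample(snapshot, i) + g_snap
            y = x - step * v        # first forward step
            # (a backward/resolvent step would go here in the constrained
            # case; for this unconstrained toy problem it is the identity)
            w = F_sample(y, i) - F_sample(snapshot, i) + g_snap
            x = y - step * (w - v)  # correcting forward step
    return x

x_star = svrg_fbf(np.zeros(dim))
```

The variance-reduced estimator `v` becomes exact as the iterate approaches the snapshot, which is what permits a fixed step size and, under strong monotonicity, a linear rate; the expensive batch estimate is amortized over the inner loop.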

Keywords

Status: accepted
