EUROPT 2025
Abstract Submission

403. Anderson Acceleration for Primal-Dual Hybrid Gradient

Invited abstract in session WC-3: Acceleration Methods in Optimization, stream Large scale optimization: methods and algorithms.

Wednesday, 14:00-16:00
Room: B100/4011

Authors (first author is the speaker)

1. Yingxin Zhou
2. Stefano Cipolla
School of Mathematical Sciences, University of Southampton
3. Vuong Phan
University of Southampton

Abstract

The use of first-order optimization methods has gained increasing attention for solving large-scale linear programming (LP) problems due to their low per-iteration computational cost. However, a drawback of such methods is their relatively slow convergence when high-accuracy solutions are required. To overcome this disadvantage, in this talk we consider solving large-scale LP problems using the first-order method known as Primal-Dual Hybrid Gradient (PDHG), combined with an acceleration technique. Specifically, we formulate the PDHG method as a fixed-point iteration and apply the Anderson Acceleration (AA) method to enhance its convergence speed. Unlike standard fixed-point schemes, we apply a projection operator to ensure that the iterates generated by AA remain within the domain of the fixed-point operator, so that accelerated iterates cannot violate the constraints of the LP problem. Under the assumption that the fixed-point operator is non-expansive, we establish a global convergence guarantee for the proposed algorithm. Numerical results will showcase how the use of Anderson Acceleration in this framework consistently improves the robustness and efficiency of the baseline PDHG. This talk is based on Y. Zhou, S. Cipolla, V. Phan. "Anderson Acceleration for Primal-Dual Hybrid Gradient."
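
To make the scheme concrete, the following is a minimal sketch, not the authors' implementation, of Anderson-accelerated PDHG for a standard-form LP min c'x s.t. Ax = b, x >= 0. The fixed-point map, the step sizes tau and sigma, the memory size m, and the use of the nonnegativity projection to safeguard the accelerated iterates are illustrative assumptions for this sketch.

    import numpy as np

    def pdhg_map(z, A, b, c, tau, sigma):
        # One PDHG step for min c'x s.t. Ax = b, x >= 0, viewed as a fixed-point map T.
        n = c.size
        x, y = z[:n], z[n:]
        x_new = np.maximum(x - tau * (c - A.T @ y), 0.0)  # primal step, projected onto x >= 0
        y_new = y + sigma * (b - A @ (2.0 * x_new - x))   # dual step with extrapolation
        return np.concatenate([x_new, y_new])

    def project(z, n):
        # Projection keeping accelerated iterates in the operator's domain (x >= 0 here).
        z = z.copy()
        z[:n] = np.maximum(z[:n], 0.0)
        return z

    def aa_pdhg(A, b, c, tau, sigma, m=5, iters=2000, tol=1e-8):
        # Anderson Acceleration (type-II, least-squares form) wrapped around the PDHG map,
        # with each accelerated iterate projected back into the operator's domain.
        mdim, n = A.shape
        z = np.zeros(n + mdim)
        Zs, Gs = [], []                      # histories of T(z_k) and residuals g_k = T(z_k) - z_k
        for k in range(iters):
            Tz = pdhg_map(z, A, b, c, tau, sigma)
            g = Tz - z
            if np.linalg.norm(g) < tol:
                break
            Zs.append(Tz); Gs.append(g)
            if len(Gs) > m + 1:              # keep at most m residual differences
                Zs.pop(0); Gs.pop(0)
            if len(Gs) == 1:
                z = Tz                       # plain PDHG step while the history is empty
            else:
                dG = np.column_stack([Gs[i + 1] - Gs[i] for i in range(len(Gs) - 1)])
                dZ = np.column_stack([Zs[i + 1] - Zs[i] for i in range(len(Zs) - 1)])
                gamma, *_ = np.linalg.lstsq(dG, g, rcond=None)  # least-squares AA weights
                z = project(Tz - dZ @ gamma, n)                 # accelerated, safeguarded step
        return z[:n], z[n:]

For PDHG itself to converge, the step sizes should satisfy the usual condition tau * sigma * ||A||^2 < 1; the projection in aa_pdhg plays the role described in the abstract of keeping the AA iterates inside the feasible domain of the operator.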

Keywords

Status: accepted
