EUROPT 2025
Abstract Submission

132. A low-rank augmented Lagrangian method for doubly nonnegative relaxations of mixed-binary quadratic programs

Invited abstract in session WB-2: High-order and tensor methods, stream Nonsmooth and nonconvex optimization.

Wednesday, 10:30-12:30
Room: B100/7011

Authors (first author is the speaker)

1. Tianyun Tang
Institute of Operations Research and Analytics, National University of Singapore

Abstract

Doubly nonnegative (DNN) programming problems are known to be challenging to solve because of their Ω(n²) constraints and Ω(n²) variables. In this work, we introduce RNNAL, a method for solving DNN relaxations of large-scale mixed-binary quadratic programs by leveraging the possible low-rank property of their solutions. RNNAL is a globally convergent Riemannian augmented Lagrangian method (ALM) that penalizes the nonnegativity and complementarity constraints while preserving all other constraints as an algebraic variety. After applying the low-rank decomposition to the ALM subproblem, its feasible region becomes an algebraic variety with favorable geometric properties. Our low-rank decomposition model differs from the standard Burer-Monteiro (BM) decomposition model in a key respect: we equivalently reformulate most of the quadratic constraints arising after the BM decomposition into fewer and more manageable affine constraints. This modification also helps to alleviate the violation of Slater's condition for the primal DNN problem. Moreover, we show that the metric projection onto the algebraic variety, although non-convex, can be transformed into a solvable convex optimization problem under certain regularity conditions, which can be ensured by a constraint-relaxation strategy. Numerous numerical experiments validate the efficiency of the proposed RNNAL method.
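To make the penalization idea concrete, the following is a minimal, hypothetical Python sketch, not the authors' RNNAL implementation: it applies a Burer-Monteiro factorization X = RRᵀ to a toy problem of the form min ⟨C, X⟩ subject to diag(X) = 1 (an assumed stand-in for the constraints kept on the variety), X ⪰ 0 and X ≥ 0, handling the entrywise nonnegativity by a quadratic penalty as in an ALM subproblem. Complementarity constraints, multiplier updates, and the affine reformulation described above are all omitted.

```python
import numpy as np

def penalized_obj(R, C, rho):
    # ALM-style subproblem objective after the BM factorization X = R R^T:
    # trace(C X) plus a quadratic penalty on violations of X >= 0.
    X = R @ R.T
    viol = np.minimum(X, 0.0)            # entrywise negative part of X
    return np.sum(C * X) + 0.5 * rho * np.sum(viol ** 2)

def grad(R, C, rho):
    # Gradient w.r.t. R: for symmetric C and symmetric violation matrix V,
    # d/dR [ <C, RR^T> + (rho/2)||V||_F^2 ] = 2 (C + rho V) R.
    X = R @ R.T
    viol = np.minimum(X, 0.0)
    return (C + C.T + rho * (viol + viol.T)) @ R

def project_rows(R):
    # Enforce diag(R R^T) = 1 by normalizing each row of R
    # (a simple surrogate for retracting onto the algebraic variety).
    norms = np.linalg.norm(R, axis=1, keepdims=True)
    return R / np.maximum(norms, 1e-12)

rng = np.random.default_rng(0)
n, r, rho = 20, 4, 10.0                  # problem size, rank, penalty weight
C = rng.standard_normal((n, n))
C = 0.5 * (C + C.T)                      # symmetric cost matrix
R = project_rows(rng.standard_normal((n, r)))
for _ in range(500):                     # projected gradient descent
    R = project_rows(R - 1e-3 * grad(R, C, rho))
```

The point of the low-rank variable R is that it has only n·r entries instead of the Ω(n²) entries of X, while the penalty lets the iteration stay on the smooth variety rather than in the nonnegative cone.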

Keywords

Status: accepted
