EUROPT 2025
Abstract Submission


253. Second-order proximal-gradient methods for avoiding nonsmooth strict saddle points

Invited abstract in session MD-1: Smoothing techniques for nonsmooth optimization, stream Nonsmooth and nonconvex optimization.

Monday, 16:30-18:30

Authors (first author is the speaker)

1. Alexander Bodard
Department of Electrical Engineering, KU Leuven
2. Masoud Ahookhosh
Department of Mathematics, University of Antwerp
3. Panagiotis Patrinos
Electrical Engineering, KU Leuven

Abstract

This work introduces two proximal-gradient-based algorithms -- a trust-region method and a curvilinear linesearch method -- for minimizing the sum of a smooth function f and a nonsmooth function g. We prove that these methods converge to second-order stationary points of the objective whenever the proximal mapping of g is locally of class C1 around the forward points of the limit points. If g belongs to the broad class of C2-partly smooth functions, this last requirement is shown to hold under a mild strict complementarity condition. We highlight, in particular, that these results do not require local smoothness of the objective around the limit points. To the best of our knowledge, the presented algorithms constitute the first second-order proximal-gradient algorithms that provably escape nonsmooth strict saddle points, irrespective of the initial iterate. Our approach is based on the so-called forward-backward envelope (FBE), which has previously enabled the design of globally convergent linesearch methods. As an intermediate step in our analysis, we describe conditions under which the FBE is locally of class C2 around critical points, and establish an equivalence between the second-order stationary points of the original objective and those of the FBE. Finally, numerical experiments validate the proposed methods, both on illustrative toy problems and on sparse principal component analysis and phase retrieval problems.
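For readers unfamiliar with the forward-backward envelope, the following minimal Python sketch illustrates the standard FBE construction the abstract builds on; it is not the authors' trust-region or curvilinear linesearch method, and the problem data (a least-squares f, an l1-norm g with weight lam, matrix A, vector b, stepsize gamma) are illustrative assumptions only.

    import numpy as np

    # Illustrative sketch (not the authors' code): forward-backward step and
    # forward-backward envelope (FBE) value for f(x) = 0.5*||Ax - b||^2 and
    # g(x) = lam*||x||_1. All names below are assumptions for this example.

    def grad_f(A, b, x):
        # Gradient of the smooth part f(x) = 0.5*||Ax - b||^2.
        return A.T @ (A @ x - b)

    def prox_g(z, gamma, lam):
        # Proximal mapping of g = lam*||.||_1 (soft-thresholding).
        return np.sign(z) * np.maximum(np.abs(z) - gamma * lam, 0.0)

    def fbe(A, b, x, gamma, lam):
        # FBE: phi_gamma(x) = f(x) + <grad f(x), xbar - x> + g(xbar)
        #                     + ||xbar - x||^2 / (2*gamma),
        # where xbar = prox_{gamma g}(x - gamma*grad f(x)) is the forward-backward point.
        gx = grad_f(A, b, x)
        xbar = prox_g(x - gamma * gx, gamma, lam)
        fx = 0.5 * np.linalg.norm(A @ x - b) ** 2
        val = (fx + gx @ (xbar - x) + lam * np.linalg.norm(xbar, 1)
               + np.linalg.norm(xbar - x) ** 2 / (2 * gamma))
        return val, xbar

    # Toy usage: a few proximal-gradient steps, monitoring the FBE value.
    rng = np.random.default_rng(0)
    A, b = rng.standard_normal((20, 50)), rng.standard_normal(20)
    x, lam = np.zeros(50), 0.1
    gamma = 0.9 / np.linalg.norm(A, 2) ** 2  # stepsize below 1/L with L = ||A||^2
    for _ in range(50):
        val, x = fbe(A, b, x, gamma, lam)

The sketch only evaluates the FBE along plain forward-backward iterations; the abstract's contribution concerns second-order (trust-region and curvilinear linesearch) schemes built on this envelope.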

Keywords

Status: accepted

