313. The Boosted Double-Proximal Subgradient Algorithm for Nonconvex Optimization
Invited abstract in session WF-3: Splitting algorithms, stream Variational analysis: theory and algorithms.
Wednesday, 16:20 - 18:00, Room: M:J
Authors (first author is the speaker)
1. Francisco Javier Aragón Artacho (Mathematics, University of Alicante)
Abstract
In this talk we present a new splitting algorithm that can be used to tackle very general structured nonconvex minimization problems. Specifically, we assume that the objective function can be expressed as the sum of a locally Lipschitz function f satisfying the descent lemma, plus an l.s.c. prox-bounded function g, minus a finite sum of compositions of continuous convex functions with differentiable functions having Lipschitz continuous gradients. Our algorithm makes use of subgradients of f, the gradients of the differentiable functions, and proximal steps of g and of the conjugates of the convex functions. Since, in addition, it includes a line-search step that improves its performance, we name it the Boosted Double-Proximal Subgradient Algorithm (BDSA). If time permits, we will conclude the talk with some numerical experiments demonstrating the good performance of BDSA.
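As a sketch of the problem structure described above (the symbols m, h_i, and Phi_i are illustrative names not given in the abstract), the model problem can be written as
\[
  \min_{x \in \mathbb{R}^n} \; f(x) + g(x) - \sum_{i=1}^{m} h_i\bigl(\Phi_i(x)\bigr),
\]
where f is locally Lipschitz and satisfies the descent lemma, g is l.s.c. and prox-bounded, each h_i is continuous and convex, and each \Phi_i is differentiable with Lipschitz continuous gradient. The proximal steps on the conjugates h_i^* mentioned in the abstract presumably serve to handle the subtracted terms through Fenchel duality.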
This is a joint work with David Torregrosa-Belén and Pedro Pérez-Aros.
Keywords
- Analysis and engineering of optimization algorithms
- Linear and nonlinear optimization
Status: accepted