The variable metric forward-backward splitting algorithm under mild differentiability assumptions and line searches

Invited abstract in session MB-54: Projection methods in optimization problems 2, stream Convex Optimization.

Area: Continuous Optimization

Monday, 10:30-12:00
Room: Building PA, Room B

Authors (first author is the speaker)

1. Saverio Salzo
LCSL, Istituto Italiano di Tecnologia


We study the variable metric forward-backward splitting algorithm for convex minimization
problems without the standard assumption that the gradient of the smooth part is Lipschitz continuous.
In this setting, we prove that weak convergence of the iterates and convergence of the objective function values still hold, requiring only mild assumptions on the smooth part of the objective function and using several types of backtracking line search procedures to determine the step lengths or the relaxation parameters. Moreover, under slightly stronger differentiability assumptions, an o(1/k) convergence rate in the function values is obtained.
Our results extend and unify several studies on variable metric proximal/projected gradient methods
under different hypotheses. We finally address applications and show that, using the proposed line search procedures, the scope of applicability of the variable metric forward-backward splitting algorithm can be considerably enlarged, including problems that involve Banach spaces and smooth functions of divergent type.
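To illustrate the kind of scheme the abstract discusses, the following is a minimal sketch of forward-backward splitting with a backtracking line search on the step length. It is not the paper's algorithm: it uses a fixed (identity) metric rather than a variable one, a scalar toy problem f(x) = 0.5(x - 3)^2 plus g(x) = |x|, and a simple sufficient-decrease test in place of a Lipschitz constant; all names and constants here are illustrative assumptions.

```python
def f(x):
    # smooth part: f(x) = 0.5 * (x - 3)^2 (toy choice for illustration)
    return 0.5 * (x - 3) ** 2

def grad_f(x):
    # gradient of the smooth part
    return x - 3

def prox_g(x, step, lam=1.0):
    # proximity operator of lam * |x|: soft-thresholding
    if x > step * lam:
        return x - step * lam
    if x < -step * lam:
        return x + step * lam
    return 0.0

def forward_backward(x0, step0=1.0, shrink=0.5, tol=1e-10, max_iter=1000):
    """Forward-backward iterations with backtracking on the step length.

    The backtracking loop shrinks the step until a quadratic upper-bound
    (sufficient-decrease) condition holds, so no global Lipschitz
    constant for grad_f is required.
    """
    x = x0
    step = step0
    for _ in range(max_iter):
        g = grad_f(x)
        while True:
            x_new = prox_g(x - step * g, step)  # forward-backward step
            d = x_new - x
            # quadratic upper-bound test at the trial point
            if f(x_new) <= f(x) + g * d + (d * d) / (2 * step):
                break
            step *= shrink
        if abs(x_new - x) < tol:
            return x_new
        x = x_new
    return x
```

For this toy objective the minimizer of 0.5(x - 3)^2 + |x| is x = 2, and the iteration reaches it from x0 = 0 in a couple of steps.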


Status: accepted
