Accelerated and preconditioned Douglas-Rachford algorithms for the solution of variational imaging problems

Invited abstract in session TA-54: Large scale structured optimization 2, stream Convex Optimization.

Area: Continuous Optimization

Tuesday, 8:30-10:00
Room: Building PA, Room B

Authors (first author is the speaker)

1. Kristian Bredies
Institute for Mathematics and Scientific Computing, University of Graz
2. Hongpeng Sun
Institute for Mathematical Sciences, Renmin University of China


We present and discuss accelerated and preconditioned versions of the Douglas-Rachford (DR) splitting method for the solution of convex-concave saddle-point problems that frequently arise in variational imaging. These algorithms rely solely on implicit steps, namely proximal-operator evaluations and the solution of linear equations, and do not require step-size constraints. While the basic DR iteration admits weak convergence and an ergodic convergence rate of O(1/k) for restricted primal-dual gaps, acceleration yields, under appropriate strong convexity assumptions, convergence rates of O(1/k^2) and O(q^k) for some 0 < q < 1. Furthermore, all methods allow for the incorporation of preconditioners for the implicit linear step while preserving the convergence properties. Neither the error nor the step sizes need to be controlled for the inexact linear inversion.
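To illustrate the kind of iteration discussed here, the following is a minimal sketch of the basic (unaccelerated, unpreconditioned) Douglas-Rachford splitting for a toy one-dimensional problem min_x |x| + (x - b)^2/2; it is not the authors' algorithm for saddle-point problems, only the standard DR scheme built from two proximal operators, with all parameter choices (t = 1, b = 3) illustrative.

```python
import numpy as np

def prox_l1(x, t):
    # Proximal operator of t*|x|: soft-thresholding.
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def prox_quad(x, t, b):
    # Proximal operator of t/2 * (x - b)^2.
    return (x + t * b) / (1.0 + t)

def douglas_rachford(b=3.0, t=1.0, iters=200):
    # Standard DR iteration: note there is no step-size restriction;
    # any t > 0 yields convergence, since both steps are implicit.
    x = 0.0
    for _ in range(iters):
        y = prox_l1(x, t)
        z = prox_quad(2.0 * y - x, t, b)
        x = x + z - y
    return y

# The minimizer of |x| + (x - 3)^2/2 is x* = 2.
```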
The methods are applied to non-smooth convex variational imaging problems. In particular, we discuss variational models with total variation (TV) as well as total generalized variation (TGV) penalties. Preconditioners specific to these penalties are presented, results of numerical experiments are shown, and the benefits of the respective accelerated and preconditioned algorithms are discussed.
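For concreteness, the TV-regularized denoising model referred to above can be written in the convex-concave saddle-point form to which such splitting methods apply (a standard dual reformulation, not quoted from the talk; f is the noisy image, alpha > 0 the regularization weight):

\[
\min_u \; \frac{1}{2}\|u - f\|_2^2 + \alpha\,\mathrm{TV}(u)
\;=\;
\min_u \max_{\|p\|_\infty \le \alpha} \; \frac{1}{2}\|u - f\|_2^2 + \langle \nabla u,\, p\rangle,
\]

using that the total variation is the support function of the set of dual fields p with \(\|p\|_\infty \le 1\).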


Status: accepted
