1638. Convergence analysis of optimization-by-continuation proximal gradient algorithm and some primal-dual extensions
Invited abstract in session MA-34: Optimization and learning for data science and imaging (Part I), stream Advances in large scale nonlinear optimization.
Monday, 8:30-10:00, Room: 43 (building: 303A)
Authors (first author is the speaker)
1. Ignace Loris, Mathematics Department, Université libre de Bruxelles
Abstract
In this work we focus on the concept of optimization-by-continuation as a strategy for solving a set of optimization problems in about the time it would take to solve a single instance of them. Each cost function is a different linear combination of two convex terms, one differentiable and the other prox-simple. Such optimization problems occur frequently in the numerical solution of inverse problems (data misfit term plus penalty or constraint term), and the relative weight of the two terms is often not known in advance. The algorithm's special feature lies in its ability to approximate, in a single iteration run, the minimizers of the cost function for many different values of the parameter determining the relative weight of the two terms. We also discuss the same problem in the presence of a third term in the cost function, namely the composition of a prox-simple function with a linear map. As a special case, one recovers a generalization of the primal-dual algorithm of Chambolle and Pock.
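To make the continuation setting concrete, the following is a minimal sketch of the classical warm-started continuation baseline for a smooth-plus-prox-simple problem, min_x 0.5‖Ax − b‖² + λ‖x‖₁, where the l1 norm plays the role of the prox-simple term. All names (`prox_gradient`, `continuation`) and the synthetic data are illustrative assumptions; this loop solves the problems sequentially and is not the single-run algorithm proposed in the abstract.

```python
import numpy as np

def soft_threshold(z, t):
    """Proximal operator of t*||.||_1 (the prox-simple term)."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def prox_gradient(A, b, lam, x0, n_iter=200):
    """Proximal gradient iteration for 0.5*||Ax - b||^2 + lam*||x||_1."""
    step = 1.0 / np.linalg.norm(A, 2) ** 2   # 1/L, L = Lipschitz constant of the gradient
    x = x0.copy()
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)             # gradient of the differentiable term
        x = soft_threshold(x - step * grad, step * lam)
    return x

def continuation(A, b, lams):
    """Solve for a decreasing sequence of relative weights, warm-starting each solve."""
    x = np.zeros(A.shape[1])
    solutions = {}
    for lam in sorted(lams, reverse=True):
        x = prox_gradient(A, b, lam, x0=x)   # reuse the previous minimizer
        solutions[lam] = x.copy()
    return solutions

# Example usage on synthetic inverse-problem data
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 100))
x_true = np.zeros(100)
x_true[:5] = 1.0
b = A @ x_true
sols = continuation(A, b, lams=[1.0, 0.1, 0.01])
```

The warm start exploits the fact that minimizers for nearby weights are close, which is also the intuition behind tracking the whole family of minimizers within one iteration run.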
Keywords
- Convex Optimization
- Large Scale Optimization
- Algorithms
Status: accepted