296. A primal-dual algorithm for variational image reconstruction with learned convex regularizers
Invited abstract in session MC-5: Optimization and machine learning II, stream Optimization for machine learning.
Monday, 14:00-16:00, Room: B100/4013
Authors (first author is the speaker)
1. Hok Shing Wong, Mathematical Sciences, University of Bath
Abstract
We address the optimization problem in a data-driven variational reconstruction framework, where the regularizer is parameterized by an input-convex neural network (ICNN). While gradient-based methods are commonly used to solve such problems, they struggle to handle non-smooth objectives effectively, which often leads to slow convergence. Moreover, the nested structure of the neural network complicates the application of standard non-smooth optimization techniques, such as proximal algorithms. To overcome these challenges, we reformulate the problem and eliminate the network's nested structure. By relating this reformulation to epigraphical projections of the activation functions, we transform the problem into a convex optimization problem that can be solved efficiently with a primal-dual algorithm. We also prove that this reformulation is equivalent to the original variational problem. Through experiments on several imaging tasks, we demonstrate that the proposed approach outperforms subgradient methods in both speed and stability.
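To illustrate the kind of regularizer the abstract refers to, the following is a minimal NumPy sketch of an input-convex neural network, not the authors' architecture. The layer sizes, ReLU activations, and weight initialization are assumptions for illustration; the construction (non-negative weights on hidden activations, unconstrained skip connections from the input, convex non-decreasing activations) is the standard recipe that makes the output convex in the input x.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(v):
    # ReLU is convex and non-decreasing, which the ICNN construction relies on.
    return np.maximum(v, 0.0)

class ICNN:
    """Minimal input-convex neural network sketch (illustrative, not the
    architecture from the talk). The output is convex in x because:
    - W_k are elementwise non-negative (composition preserves convexity),
    - A_k are unconstrained skip connections from the raw input,
    - the activations are convex and non-decreasing."""

    def __init__(self, dim_in, dim_hidden, n_layers):
        # Non-negative weights on the previous layer's activations.
        self.W = [np.abs(rng.normal(size=(dim_hidden, dim_hidden)))
                  for _ in range(n_layers)]
        # Unconstrained skip weights on the raw input x.
        self.A = [rng.normal(size=(dim_hidden, dim_in))
                  for _ in range(n_layers + 1)]
        self.b = [rng.normal(size=dim_hidden) for _ in range(n_layers + 1)]
        # Non-negative output weights keep the final sum convex.
        self.w_out = np.abs(rng.normal(size=dim_hidden))

    def __call__(self, x):
        z = relu(self.A[0] @ x + self.b[0])
        for W, A, b in zip(self.W, self.A[1:], self.b[1:]):
            z = relu(W @ z + A @ x + b)
        return float(self.w_out @ z)  # scalar regularizer value R(x)
```

Convexity can be checked numerically via the midpoint inequality, R((x+y)/2) <= (R(x)+R(y))/2, which holds for any pair of inputs by construction.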
Keywords
- First-order optimization
- Non-smooth optimization
Status: accepted