298. Differentiating Nonsmooth Solutions to Parametric Monotone Inclusion Problems
Invited abstract in session TD-32: Algorithms for machine learning and inverse problems: optimisation for neural networks, stream Advances in large scale nonlinear optimization.
Tuesday, 14:30-16:00, Room: 41 (building: 303A)
Authors (first author is the speaker)
1. Antonio Silveti-Falls, Centre for Visual Computing, CentraleSupélec
Abstract
Understanding the differentiability and regularity of the solution to a monotone inclusion problem is an important question with consequences for convex optimization, machine learning, signal processing, and beyond. Past attempts have been made either under very restrictive assumptions that ensure the solution is continuously differentiable, or using mathematical tools that are incompatible with automatic differentiation. In this talk, we discuss how to leverage path differentiability and a recent result on nonsmooth implicit differentiation calculus to give sufficient conditions ensuring that the solution to a monotone inclusion problem is path differentiable, and we provide formulas for computing its generalized gradient. Our approach is fully compatible with automatic differentiation and comes with assumptions that are easy to check: roughly speaking, semialgebraicity and strong monotonicity. We illustrate the scope of our results by considering three fundamental composite problem settings: strongly convex problems, dual solutions to convex minimization problems, and primal-dual solutions to min-max problems.
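To make the idea concrete, here is a minimal numerical sketch of the smooth, strongly convex special case of the setting above: differentiating the solution map theta -> x*(theta) of a parametric problem through its stationarity equation via the implicit function theorem. This is only an illustration of the general mechanism, not the talk's nonsmooth calculus; the quadratic objective and all names (Q, theta) are our own assumptions.

```python
import numpy as np

# Illustrative problem (not from the talk): x*(theta) = argmin_x f(x, theta)
# with f(x, theta) = 0.5 * x^T Q x - theta^T x, strongly convex since Q > 0.
rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
Q = A @ A.T + np.eye(3)              # positive definite => strong monotonicity of the gradient
theta = rng.standard_normal(3)

# Stationarity (the monotone inclusion in the smooth case): Q x* - theta = 0.
x_star = np.linalg.solve(Q, theta)

# Implicit differentiation of F(x, theta) = Q x - theta = 0:
# dx*/dtheta = -(dF/dx)^{-1} (dF/dtheta) = Q^{-1}.
J = np.linalg.solve(Q, np.eye(3))

# Finite-difference check of the first Jacobian column.
eps = 1e-6
e0 = np.zeros(3)
e0[0] = eps
fd_col = (np.linalg.solve(Q, theta + e0) - x_star) / eps
assert np.allclose(J[:, 0], fd_col, atol=1e-4)
```

In the nonsmooth setting discussed in the talk, the classical Jacobian above is replaced by a generalized gradient obtained from a path-differentiable implicit function result, under semialgebraicity and strong monotonicity assumptions.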
Keywords
- Non-smooth Optimization
- Large Scale Optimization
- Machine Learning
Status: accepted