827. Slowly varying regression under sparsity
Invited abstract in session TC-27: Mathematical Optimization for Trustworthy Machine Learning, stream Mathematical Optimization for XAI.
Tuesday, 12:30-14:00, Room: 047 (building: 208)
Authors (first author is the speaker)
1. Vassilis Digalakis, Information Systems and Operations Management, HEC Paris
Abstract
I will present the framework of slowly varying regression under sparsity, which allows sparse regression models to exhibit slow and sparse variations, through an application to energy consumption prediction. First, I will formulate the parameter estimation problem as a mixed-integer optimization problem; I will then show that it can be exactly reformulated as a binary convex optimization problem via a novel relaxation technique that convexifies the non-convex objective while matching the original objective on all feasible binary points. I will also develop a highly optimized implementation of a cutting-plane-type algorithm, a fast regularization-based heuristic that guarantees a feasible solution, and a practical hyperparameter tuning procedure relying on binary search that, under certain assumptions, is guaranteed to recover the true model parameters.
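To fix ideas, one schematic way to write such an estimation problem (the notation, constraints, and parameters s, s_v, delta below are an illustrative reconstruction, not the exact formulation from the talk) is:

\begin{align*}
\min_{\beta_1,\dots,\beta_T \in \mathbb{R}^p} \quad & \sum_{t=1}^{T} \|y_t - X_t \beta_t\|_2^2 \;+\; \lambda \sum_{t=1}^{T} \|\beta_t\|_2^2 \\
\text{s.t.} \quad & \Big|\textstyle\bigcup_{t=1}^{T} \operatorname{supp}(\beta_t)\Big| \le s && \text{(overall sparsity)} \\
& \sum_{t=2}^{T} \|\beta_t - \beta_{t-1}\|_0 \le s_v && \text{(sparse variation)} \\
& \|\beta_t - \beta_{t-1}\|_2 \le \delta, \quad t = 2,\dots,T && \text{(slow variation)}
\end{align*}

Here y_t and X_t denote the response and design matrix for period t; introducing binary support indicators for the coefficients and their changes turns the support constraints into mixed-integer constraints, which is the starting point for the reformulation described in the abstract.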
Keywords
- Machine Learning
- Combinatorial Optimization
- Artificial Intelligence
Status: accepted