EUROPT 2025
Abstract Submission

335. Bi- and Multi-Level Optimization Strategies for Sparse and Interpretable Learning in NMF

Invited abstract in session TC-3: Theoretical and algorithmic advances in large scale nonlinear optimization and applications Part 2, stream Large scale optimization: methods and algorithms.

Tuesday, 14:00-16:00
Room: B100/4011

Authors (first author is the speaker)

1. Laura Selicato
CNR - IRSA

Abstract

Learning algorithms require careful hyperparameter tuning, especially in penalized optimization, where constraints aid in extracting interpretable, low-dimensional data representations. This work explores bi-level optimization problems for Nonnegative Matrix Factorization (NMF), a powerful tool for capturing latent structures while preserving nonnegativity. We introduce AltBi, a novel algorithm that integrates penalty hyperparameter tuning into the Kullback-Leibler NMF update rules as a bi-level optimization problem. Additionally, we present AltBi-J, a constrained Frobenius-norm approach using a diversity-based penalty that yields sparser solutions than traditional penalties. Extending this to the spectral domain, we propose the SHINBO algorithm, a bi-level optimization method applied to Itakura-Saito NMF that adaptively tunes row-wise penalties to enhance the extraction of periodic signals from noisy data, with effective application to fault detection in mechanical systems. Motivated by learning applications involving multiple constraints, we generalize this bi-level methodology to a multi-level optimization framework capable of handling nested nonconvex problems, advancing the theoretical and algorithmic foundations of learning in structured optimization paradigms.
This is joint work with Flavia Esposito and Nicoletta Del Buono from University of Bari Aldo Moro, Andersen Ang from University of Southampton, and Rafał Zdunek from Politechnika Wrocławska.
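For context, the inner problem that AltBi wraps is the classical penalized Kullback-Leibler NMF, typically solved with multiplicative updates. The sketch below shows one such update step with an l1 sparsity penalty on H; the penalty weight `lam` plays the role of the hyperparameter that the abstract's bi-level scheme tunes (the tuning itself, AltBi's contribution, is not shown). Function and variable names are illustrative, not from the paper.

```python
import numpy as np

def kl_nmf_step(V, W, H, lam=0.1, eps=1e-10):
    """One multiplicative-update step for KL-divergence NMF with an
    l1 penalty lam on H. `lam` is the inner-level hyperparameter; its
    bi-level tuning is the subject of the abstract and is not shown.
    Updates W and H in place and returns them.
    """
    # Update H: numerator W^T (V / WH), denominator W^T 1 + lam
    WH = W @ H + eps
    H *= (W.T @ (V / WH)) / (W.T @ np.ones_like(V) + lam + eps)
    # Update W: numerator (V / WH) H^T, denominator 1 H^T
    WH = W @ H + eps
    W *= ((V / WH) @ H.T) / (np.ones_like(V) @ H.T + eps)
    return W, H
```

Under these updates the penalized KL objective is nonincreasing and W, H stay nonnegative, which is why multiplicative rules are the standard inner solver in this setting.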

Status: accepted
