EUROPT 2024
Abstract Submission

141. Accelerated Algorithms For Nonlinear Matrix Decomposition With The ReLU Function

Invited abstract in session WC-5: Optimization for learning I, stream Optimization for learning.

Wednesday, 10:05 - 11:20
Room: M:N

Authors (first author is the speaker)

1. Giovanni Seraghiti
UMONS
2. Arnaud Vandaele
Mathematics and Operations Research, University of Mons
3. Margherita Porcelli
Dipartimento di Ingegneria Industriale, Università degli Studi di Firenze
4. Nicolas Gillis
Mathematics and Operational Research, Université de Mons

Abstract

In this contribution, I propose a new problem in low-rank matrix factorization, the Nonlinear Matrix Decomposition (NMD): given a sparse nonnegative matrix, find a low-rank matrix that recovers the original matrix when an element-wise nonlinear function is applied to it. I will focus on the so-called ReLU-NMD, where the nonlinear function is the rectified linear unit (ReLU) activation.
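For concreteness, ReLU-NMD is commonly formulated as the following optimization problem (the notation below is mine, not taken from the abstract): given a sparse nonnegative matrix $X \in \mathbb{R}^{m \times n}$ and a target rank $r$,

\min_{\Theta \in \mathbb{R}^{m \times n}} \; \|X - \max(0, \Theta)\|_F^2 \quad \text{subject to} \quad \mathrm{rank}(\Theta) \le r,

where $\max(0, \cdot)$ is applied entry-wise, so that the zero pattern of X can be reproduced by thresholding a low-rank matrix.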

First, I will provide a brief overview of the motivations for and possible interpretations of the model, supported by theoretical examples. I will explain the idea behind ReLU-NMD and how the nonlinearity can be exploited to obtain low-rank approximations of given data.

Then, I will highlight the connection with neural networks and present some of the existing approaches developed to tackle ReLU-NMD.

Furthermore, I will introduce two new algorithms: (1) Aggressive Accelerated NMD (A-NMD), which uses an adaptive Nesterov extrapolation to accelerate an existing algorithm, and (2) Three-Block NMD (3B-NMD), which parametrizes the low-rank approximation as the product of two factors and leads to a significant reduction in computational cost.
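To illustrate the three-block idea, the following is a minimal Python/NumPy sketch of one plausible alternating scheme built around the parametrization Theta = W H. The function name, the small regularization, and the plain least-squares updates are my own illustrative choices; the exact (accelerated) A-NMD and 3B-NMD updates are those of the talk and are not reproduced here.

import numpy as np

def relu_nmd_sketch(X, r, iters=200, seed=0):
    """Illustrative three-block alternating scheme for ReLU-NMD.

    Approximately solves  min_{Z, W, H} ||Z - W H||_F^2
    where Z must equal X on the positive entries of X and be <= 0
    where X is zero, so that max(0, W H) approximates X.
    This is a sketch, not the accelerated 3B-NMD algorithm itself.
    """
    rng = np.random.default_rng(seed)
    m, n = X.shape
    pos = X > 0                      # support of the nonzero entries of X
    W = rng.standard_normal((m, r))
    H = rng.standard_normal((r, n))
    Z = X.astype(float).copy()
    for _ in range(iters):
        # Update W and H by (lightly regularized) least squares on Z ~ W H.
        W = Z @ H.T @ np.linalg.inv(H @ H.T + 1e-8 * np.eye(r))
        H = np.linalg.inv(W.T @ W + 1e-8 * np.eye(r)) @ W.T @ Z
        # Update Z: keep X on its support, clip W H to be <= 0 elsewhere.
        WH = W @ H
        Z = np.where(pos, X, np.minimum(WH, 0.0))
    return W, H

# Usage: approximate a sparse nonnegative matrix by max(0, W H).
X = np.maximum(0, np.random.default_rng(1).standard_normal((50, 40)))
W, H = relu_nmd_sketch(X, r=5)
err = np.linalg.norm(X - np.maximum(0, W @ H)) / np.linalg.norm(X)
print(f"relative error: {err:.3f}")

The same loop can be accelerated with Nesterov-type extrapolation of the iterates, which is the direction taken by A-NMD and 3B-NMD in the talk.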

Finally, I will illustrate the effectiveness of the proposed algorithms on synthetic and real-world data sets and discuss some possible applications.

Keywords

Status: accepted
