EURO 2024 Copenhagen
Abstract Submission


1348. Regularization Methods for Sparse Optimization with Applications

Invited abstract in session WB-41: Structured nonconvex optimization, stream Nonsmooth Optimization.

Wednesday, 10:30-12:00
Room: 97 (building: 306)

Authors (first author is the speaker)

1. Carisa K.W. Yu
Department of Mathematics, Statistics and Insurance, The Hang Seng University of Hong Kong
2. Siu Kai Choy
Mathematics and Statistics, The Hang Seng University of Hong Kong

Abstract

Sparse optimization is a practical approach for tackling numerous big data challenges. Extensive empirical studies in the field have demonstrated that nonconvex regularization methods promote sparsity more strongly and achieve notably more stable sparse recovery than convex penalty methods. In this paper, we explore various nonconvex regularization models for sparse optimization problems. Theoretical analysis is conducted to establish recovery bounds for these regularization models. Additionally, we employ the widely recognized proximal gradient method to solve the regularization problems and establish convergence rates for the numerical algorithms. Finally, we apply our theoretical findings and numerical algorithms to the problem of inferring the master gene regulator in cell fate conversion, presenting compelling numerical results.
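
For readers unfamiliar with the algorithmic setting, the following is a minimal sketch of a proximal gradient iteration for a nonconvexly regularized least-squares problem. The abstract does not specify which nonconvex penalty or loss the authors use; this sketch assumes a least-squares loss with an L0 penalty (whose proximal map is hard thresholding), and the names A, b, and lam are hypothetical illustration choices, not the authors' model.

```python
# Illustrative sketch only: assumes f(x) = 0.5*||Ax - b||^2 with the
# nonconvex penalty lam*||x||_0, solved by the proximal gradient method.
import numpy as np

def prox_l0(z, threshold):
    """Proximal map of t*lam*||x||_0: hard thresholding at sqrt(2*t*lam)."""
    out = z.copy()
    out[np.abs(out) <= threshold] = 0.0
    return out

def proximal_gradient_l0(A, b, lam, max_iter=500, tol=1e-8):
    """Iterate x+ = prox_{t*lam*||.||_0}(x - t*grad f(x)) with step t = 1/L."""
    t = 1.0 / np.linalg.norm(A, 2) ** 2        # L = ||A||_2^2 is the gradient Lipschitz constant
    threshold = np.sqrt(2.0 * t * lam)
    x = np.zeros(A.shape[1])
    for _ in range(max_iter):
        grad = A.T @ (A @ x - b)               # gradient of the smooth least-squares part
        x_new = prox_l0(x - t * grad, threshold)
        if np.linalg.norm(x_new - x) <= tol * max(1.0, np.linalg.norm(x)):
            return x_new
        x = x_new
    return x

# Small synthetic usage example: recover a 5-sparse vector from noisy measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((80, 200))
x_true = np.zeros(200)
x_true[rng.choice(200, 5, replace=False)] = 3.0 * rng.standard_normal(5)
b = A @ x_true + 0.01 * rng.standard_normal(80)
x_hat = proximal_gradient_l0(A, b, lam=0.05)
print("recovered support size:", np.count_nonzero(x_hat))
```

Other nonconvex penalties mentioned in this line of work (e.g., Lp with 0 < p < 1, SCAD, MCP) fit the same template; only the proximal map changes.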

Keywords

Status: accepted

