1348. Regularization Methods for Sparse Optimization with Applications
Invited abstract in session WB-41: Structured nonconvex optimization, stream Nonsmooth Optimization.
Wednesday, 10:30-12:00, Room: 97 (building: 306)
Authors (first author is the speaker)
1. Carisa K.W. Yu, Department of Mathematics, Statistics and Insurance, The Hang Seng University of Hong Kong
2. Siu Kai Choy, Mathematics and Statistics, The Hang Seng University of Hong Kong
Abstract
Sparse optimization is a practical approach for tackling numerous big data challenges. Extensive empirical studies in the field have demonstrated that, compared with convex penalty methods, nonconvex regularization methods exhibit a significantly stronger capability to promote sparsity and notably more robust stability in sparse recovery. In this paper, we explore various nonconvex regularization models for solving sparse optimization problems. Theoretical analysis is conducted to establish recovery bounds for these regularization models. Additionally, we employ the widely recognized proximal gradient method to solve the regularization problems and establish convergence rates for the resulting numerical algorithms. Finally, we apply our theoretical findings and numerical algorithms to the problem of inferring the master gene regulator in cell fate conversion, presenting compelling numerical results.
Keywords
- Algorithms
- Optimization Modeling
Status: accepted