EURO 2024 Copenhagen

252. On Convergence of Iterative Thresholding Algorithms to Global Solution for Nonconvex Sparse Optimization

Invited abstract in session TD-41: Lower-order composite optimization problems, stream Nonsmooth Optimization.

Tuesday, 14:30-16:00
Room: 97 (building: 306)

Authors (first author is the speaker)

1. Yaohua Hu
School of Mathematical Sciences, Shenzhen University

Abstract

Sparse optimization is a popular research topic in applied mathematics and optimization, and nonconvex sparse regularization problems have been extensively studied because they reduce statistical bias and provide robust sparsity promotion in a wide range of applications. However, hampered by the nonconvex and nonsmooth structure of these regularization problems, the convergence theory of their optimization algorithms is still far from complete: only convergence to a stationary point has been established in the literature, and there is still no theoretical guarantee of convergence to a global minimum or a true sparse solution.
This talk aims to find an approximate global solution or a true sparse solution of an under-determined linear system. We propose two types of iterative thresholding algorithms, incorporating a continuation technique and a truncation technique, respectively. We introduce the notion of a limited shrinkage thresholding operator and apply it, together with the restricted isometry property, to show that the proposed algorithms converge to an approximate global solution or a true sparse solution within a tolerance determined by the noise level and the limited shrinkage magnitude. Applying these results to nonconvex regularization problems with SCAD, MCP and Lp penalties, and utilizing the recovery bound theory, we establish the convergence of their proximal gradient algorithms to an approximate global solution of nonconvex regularization problems.
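As background for the iteration scheme the talk studies, the sketch below is a minimal, generic iterative hard thresholding (IHT) loop for recovering a sparse solution of an under-determined noiseless system Ax = b. It is not the authors' proposed algorithm (which adds continuation/truncation techniques and limited shrinkage operators); the dimensions, step size, and function names here are illustrative assumptions only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative problem sizes: m measurements, n unknowns, k nonzeros.
m, n, k = 30, 50, 3
A = rng.standard_normal((m, n))

# Ground-truth k-sparse signal with entries bounded away from zero.
x_true = np.zeros(n)
support = rng.choice(n, size=k, replace=False)
x_true[support] = rng.uniform(1.0, 2.0, size=k) * rng.choice([-1.0, 1.0], size=k)

b = A @ x_true  # noiseless measurements

def hard_threshold(x, k):
    # Keep the k largest-magnitude entries, zero out the rest.
    out = np.zeros_like(x)
    idx = np.argsort(np.abs(x))[-k:]
    out[idx] = x[idx]
    return out

# Gradient step size 1/||A||_2^2 (spectral norm squared) keeps the
# iteration stable; this is the standard textbook choice for IHT.
mu = 1.0 / np.linalg.norm(A, 2) ** 2

x = np.zeros(n)
for _ in range(1000):
    x = hard_threshold(x + mu * A.T @ (b - A @ x), k)

rel_err = np.linalg.norm(x - x_true) / np.linalg.norm(x_true)
```

In this easy noiseless regime (m well above k log n), the iterate typically locks onto the true support, after which the update is a linear contraction on that support and the error decays geometrically; in the noisy or nonconvex-penalty settings the talk addresses, one can only expect convergence to within a tolerance tied to the noise level.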

Keywords

Status: accepted
