3936. Newton-type proximal gradient method with quadratic subproblem approximation for unconstrained optimization with L1-regularizer
Invited abstract in session WD-41: Nonsmooth optimization algorithms II, stream Nonsmooth Optimization.
Wednesday, 14:30-16:00, Room: 97 (building: 306)
Authors (first author is the speaker)
1. Yasushi Narushima, Department of Industrial and Systems Engineering, Keio University
2. Hiroshi Ben, Keio University
Abstract
We consider an unconstrained optimization problem whose objective function is the sum of a continuously differentiable function and an L1-regularizer. One approach to such problems, the Newton-type proximal gradient method, has attracted attention. Newton-type proximal gradient methods require solving a subproblem at each iteration to obtain the scaled proximal mapping, and the computational cost of this subproblem is sometimes very high. In our study, we address this difficulty by approximating the subproblem in a more easily solvable form. Concretely, we approximate the L1-norm in the objective function of the subproblem by a quadratic function, which allows the subproblem to be solved with only simple calculations. However, obtaining this solution requires solving a system of linear equations. To address this, we compute the Hessian approximation in the Newton-type proximal gradient method by a memoryless quasi-Newton method, and we apply the Sherman-Morrison-Woodbury formula to obtain a closed-form expression for the inverse matrix at reduced computational cost. Thus, we easily obtain an approximate solution of the scaled proximal mapping. Based on the proposed subproblem solver, we present a Newton-type proximal gradient algorithm and show its global convergence under some standard assumptions.
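The abstract does not give the authors' formulas, but the ingredients it names can be sketched as follows, under stated assumptions: a standard quadratic majorizer of the L1 term at a reference point y (namely |x_i| <= x_i^2/(2|y_i|) + |y_i|/2 for y_i != 0) turns the subproblem into a linear system, and when the Hessian approximation has the diagonal-plus-low-rank form typical of a memoryless quasi-Newton update (here a hypothetical B = gamma*I + U V^T with a rank-2 correction), the Sherman-Morrison-Woodbury formula reduces that system to a 2x2 solve. All names and the specific majorizer are illustrative, not the paper's method.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100          # problem dimension (illustrative)
lam = 0.1        # L1 regularization weight (illustrative)

# Hypothetical memoryless quasi-Newton Hessian approximation:
# B = gamma*I + U V^T, a diagonal plus a rank-2 correction.
gamma = 1.0
U = 0.1 * rng.standard_normal((n, 2))
V = U.copy()                        # symmetric low-rank part U U^T
g = rng.standard_normal(n)          # stands in for the subproblem's linear term
y = rng.uniform(0.5, 1.5, n)        # reference point for the majorizer (y_i != 0)

# Quadratic majorization of the L1 term at y:
#   |x_i| <= x_i^2 / (2|y_i|) + |y_i|/2,
# so minimizing 0.5 x^T B x - g^T x + lam*||x||_1 is replaced by
# the linear system (B + lam*diag(1/|y|)) x = g, whose matrix is
# D + U V^T with D diagonal.
d = gamma + lam / np.abs(y)         # diagonal part D

# Sherman-Morrison-Woodbury:
#   (D + U V^T)^{-1} = D^{-1} - D^{-1} U (I + V^T D^{-1} U)^{-1} V^T D^{-1}
d_inv = 1.0 / d                     # O(n): inverting the diagonal
Dinv_U = d_inv[:, None] * U         # D^{-1} U, O(nk) with k = 2
small = np.eye(2) + V.T @ Dinv_U    # 2x2 system instead of n x n
x = d_inv * g - Dinv_U @ np.linalg.solve(small, V.T @ (d_inv * g))

# Sanity check against a dense direct solve of the same system
A = np.diag(d) + U @ V.T
assert np.allclose(x, np.linalg.solve(A, g))
```

The point of the SMW step is that only a k x k system (k = 2 here) is ever factored, so the per-iteration cost stays linear in n instead of cubic.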
Keywords
- Continuous Optimization
- Non-smooth Optimization
- Algorithms
Status: accepted