72. New gradient methods with quasi-Newton property and quasi-Newton update
Invited abstract in session MD-35: Nonlinear Optimization Algorithms and Applications: 3, stream Continuous and mixed-integer nonlinear programming: theory and algorithms.
Monday, 14:30-16:00, Room: Michael Sadler LG15
Authors (first author is the speaker)
1. Cong Sun (School of Science, Beijing University of Posts and Telecommunications)
Abstract
New stepsizes for the gradient method are proposed from the perspective of the quasi-Newton property, and a unified stepsize update strategy is derived from quasi-Newton update formulas. Combined with the GLL nonmonotone line search, the proposed gradient method is proved to be globally convergent, and its convergence rate is analyzed. Numerical experiments compare gradient methods based on different quasi-Newton update strategies; the variant with the BFGS update performs best, outperforming benchmark gradient and conjugate gradient methods in terms of computational cost.
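Two of the ingredients named above are standard and can be made concrete. Imposing the secant condition on a scalar (identity-scaled) Hessian approximation yields the classical Barzilai-Borwein stepsize α_k = (s_{k-1}ᵀ s_{k-1}) / (s_{k-1}ᵀ y_{k-1}), the simplest example of a stepsize with the quasi-Newton property. The sketch below combines such a BB-type stepsize with the GLL nonmonotone line search; it is a minimal illustration of that generic framework, not the paper's unified quasi-Newton update strategy, and all names and parameters (gll_nonmonotone_gradient, M, gamma, sigma) are hypothetical.

```python
import numpy as np

def gll_nonmonotone_gradient(f, grad, x0, alpha0=1.0, M=10,
                             gamma=1e-4, sigma=0.5, tol=1e-6, max_iter=1000):
    """Gradient method with a BB-type (quasi-Newton-flavoured) stepsize,
    safeguarded by the GLL nonmonotone line search. A generic sketch only."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    f_hist = [f(x)]              # recent objective values for the GLL test
    alpha = alpha0
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        d = -g                   # steepest-descent direction
        f_ref = max(f_hist[-M:]) # GLL: compare against max of last M values
        t = alpha
        # Nonmonotone Armijo backtracking (bounded to avoid infinite loops)
        for _ in range(50):
            if f(x + t * d) <= f_ref + gamma * t * (g @ d):
                break
            t *= sigma
        s = t * d
        x_new = x + s
        g_new = grad(x_new)
        y = g_new - g
        # BB1 stepsize: least-squares solution of the scaled secant
        # condition (1/alpha) s ~ y, i.e. alpha = s's / s'y
        sy = s @ y
        alpha = (s @ s) / sy if sy > 0 else alpha0
        x, g = x_new, g_new
        f_hist.append(f(x))
    return x
```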
Keywords
- Continuous Optimization
- Programming, Nonlinear
- Algorithms
Status: accepted