318. Global convergence of a second-order augmented Lagrangian method under an error bound condition
Invited abstract in session TB-6: Advances in nonsmooth optimization, stream Nonsmooth and nonconvex optimization.
Tuesday, 10:30-12:30, Room: B100/7013
Authors (first author is the speaker)
1. Renan William Prado (Institute of Mathematics and Statistics, University of São Paulo)
2. Gabriel Haeser (Department of Applied Mathematics, University of São Paulo)
3. Roberto Andreani (Department of Applied Mathematics, State University of Campinas)
4. Maria Laura Schuverdt (CONICET, Department of Mathematics, FCE, Universidad Nacional de La Plata)
5. Leonardo Delarmelina Secchin (Department of Applied Mathematics, Federal University of Espírito Santo)
Abstract
In this talk, we deal with convergence to points satisfying the weak second-order necessary optimality conditions for a second-order safeguarded augmented Lagrangian method from the literature. To this end, we propose a new second-order sequential optimality condition that is, in a certain sense, based on the iterates generated by the algorithm itself. This also allows us to establish the best possible global convergence result for the method under study, from which a companion constraint qualification is derived. The companion constraint qualification is independent of the Mangasarian-Fromovitz and constant-rank constraint qualifications and remains verifiable without them, as it can be certified by several known constraint qualifications. Furthermore, unlike similar results from previous works, the new constraint qualification cannot be weakened by another one that retains second-order global convergence guarantees for the method, and it assures second-order stationarity without constant-rank hypotheses. To obtain the latter result, we establish convergence of the method under a property slightly stronger than the error bound constraint qualification, a property that, until now, has not been known to be associated with nonlinear optimization methods.
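As context for the class of methods named in the abstract (this formulation is not part of the abstract itself), safeguarded augmented Lagrangian methods are commonly built on the Powell-Hestenes-Rockafellar function; the sketch below shows that standard formulation under this assumption, with the symbols f, h, g, rho and the safeguarded multipliers introduced here purely for illustration.

```latex
% Minimal sketch, assuming the standard Powell-Hestenes-Rockafellar (PHR)
% augmented Lagrangian used by safeguarded methods; symbols are illustrative
% and not taken from the abstract.
% Problem: minimize f(x) subject to h(x) = 0 and g(x) <= 0.
\[
  L_{\rho}(x,\lambda,\mu)
  \;=\; f(x)
  \;+\; \frac{\rho}{2}\left(
        \Bigl\| h(x) + \tfrac{\lambda}{\rho} \Bigr\|^{2}
      + \Bigl\| \max\Bigl\{0,\; g(x) + \tfrac{\mu}{\rho}\Bigr\} \Bigr\|^{2}
      \right),
\]
% where the max is taken componentwise. At iteration k, the subproblem is to
% approximately minimize L_{\rho_k}(\,\cdot\,, \bar\lambda^{k}, \bar\mu^{k})
% with safeguarded multiplier estimates \bar\lambda^{k}, \bar\mu^{k} taken from
% a bounded box; second-order variants additionally require approximate
% second-order stationarity of each subproblem before updating the multipliers
% and the penalty parameter \rho_k.
```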
Keywords
- Second- and higher-order optimization
- Linear and nonlinear optimization
- Optimization software
Status: accepted