238. Developing Extended Memoryless Optimization Algorithms Based on the Ellipsoid Norms
Invited abstract in session FB-6: Higher-order Methods in Mathematical Programming II, stream Challenges in nonlinear programming.
Friday, 10:05 - 11:20, Room: M:H
Authors (first author is the speaker)
1. Saman Babaie-Kafaki, Faculty of Engineering, Free University of Bozen-Bolzano
Abstract
Although the Euclidean norm has been widely used in the analysis of algorithmic and modeling aspects of optimization, the ellipsoid norm has also been employed across the analytical literature. Effective applications of the ellipsoid norm can be observed in the convergence analysis of the steepest descent algorithm, in devising quasi-Newton updates, and particularly in developing scaled trust region methods. As an extension of the Euclidean norm, the ellipsoid norm can be flexibly utilized in designing new memoryless algorithms for the high-dimensional optimization problems that frequently arise in real-world applications, and it therefore deserves more attention. Hence, to broaden the available toolbox of optimization methods, the ellipsoid norm is applied in a least-squares framework, with quasi-Newton updating formulas serving as the index of the norm. As part of this endeavor, several extended formulas are developed for the scaling parameter of memoryless quasi-Newton algorithms; these formulas can also be regarded as scalar approximations of the (inverse) Hessian. Through a similar approach, several adaptive choices are suggested for the parameter of the Dai-Liao method, a well-known class of nonlinear conjugate gradient algorithms. The effect of such extensions on hybrid conjugate gradient algorithms is also studied. Finally, results of computational experiments are reported.
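The extended formulas announced above are not reproduced in the abstract. For orientation only, the minimal Python/NumPy sketch below illustrates the classical ingredients they generalize: the ellipsoid norm, the Barzilai-Borwein scaling parameters obtained from Euclidean least-squares problems (scalar approximations of the inverse Hessian), and the Dai-Liao search direction. All function names and the demo data are illustrative assumptions, not the speaker's method.

import numpy as np

def ellipsoid_norm(x, A):
    # Ellipsoid (weighted) norm ||x||_A = sqrt(x^T A x); A must be
    # symmetric positive definite. A = I recovers the Euclidean norm.
    return float(np.sqrt(x @ A @ x))

def bb_scaling(s, y):
    # Classical least-squares scaling parameters (Barzilai-Borwein),
    # derived with the Euclidean norm; the talk replaces this norm by
    # an ellipsoid norm whose index is a quasi-Newton update. Both
    # values are scalar approximations of the inverse Hessian.
    theta_bb1 = (s @ s) / (s @ y)  # argmin_theta ||s/theta - y||_2
    theta_bb2 = (s @ y) / (y @ y)  # argmin_theta ||s - theta*y||_2
    return theta_bb1, theta_bb2

def dai_liao_direction(g_new, d, s, y, t=1.0):
    # Dai-Liao conjugate gradient direction with parameter t >= 0:
    # beta_k = (g_{k+1}^T y_k - t * g_{k+1}^T s_k) / (d_k^T y_k),
    # d_{k+1} = -g_{k+1} + beta_k * d_k.
    beta = (g_new @ y - t * (g_new @ s)) / (d @ y)
    return -g_new + beta * d

# Illustrative data: a quadratic model with SPD Hessian A, so that the
# secant pair satisfies y = A s and hence s^T y > 0.
rng = np.random.default_rng(0)
n = 5
M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)
s = rng.standard_normal(n)
y = A @ s
g_old = rng.standard_normal(n)
g_new = rng.standard_normal(n)
d = -g_old  # previous (steepest descent) direction d_k

print("||s||_A      =", ellipsoid_norm(s, A))
print("BB scalings  =", bb_scaling(s, y))
print("DL direction =", dai_liao_direction(g_new, d, s, y, t=0.1))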
Keywords
- Linear and nonlinear optimization
- Large- and Huge-scale optimization
- Convex and non-smooth optimization
Status: accepted