EURO 2024 Copenhagen

260. Zeroth-order implementation of the regularized Newton method with lazy approximated Hessians

Invited abstract in session TC-32: Algorithms for machine learning and inverse problems: zeroth-order optimisation, stream Advances in large scale nonlinear optimization.

Tuesday, 12:30-14:00
Room: 41 (building: 303A)

Authors (first author is the speaker)

1. Geovani Grapiglia
Applied Mathematics, Université catholique de Louvain

Abstract

In this work, we develop a zeroth-order (derivative-free) implementation of the cubically regularized Newton method for solving general non-convex optimization problems. Specifically, we consider the class of unconstrained problems whose objective function is twice continuously differentiable with Lipschitz continuous Hessians. The proposed method employs finite difference approximations of gradient vectors and Hessian matrices. We use a special adaptive search procedure in our algorithm, which simultaneously fits both the regularization parameters of the models and the stepsizes of the finite difference approximations. Thanks to this procedure, our scheme does not require knowledge of the actual Lipschitz constants. Additionally, we equip our algorithm with lazy Hessian updates, meaning that the method reuses a previously computed Hessian approximation matrix for several iterations. In terms of worst-case complexity, we prove that the proposed method is able to find an $\epsilon$-approximate stationary point using at most $\mathcal{O}( n^{3/2} \epsilon^{-3/2} )$ function evaluations, where $n$ is the dimension of the problem. This complexity bound improves on existing bounds in terms of the joint dependence on $n$ and $\epsilon$ for derivative-free methods applied to the minimization of functions with Lipschitz continuous Hessians. Illustrative numerical results confirm our theoretical findings.
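To illustrate the ingredients described above, the following is a minimal, hypothetical sketch of finite-difference gradient/Hessian approximations combined with lazy Hessian updates. It is not the authors' algorithm: for simplicity it takes a quadratically regularized Newton step with a fixed regularization parameter `sigma`, a fixed difference stepsize `h`, and a fixed refresh period `m`, whereas the actual method uses cubic regularization and an adaptive search that fits both the regularization parameters and the stepsizes.

```python
import numpy as np

def fd_gradient(f, x, h):
    """Central finite-difference gradient: n function-evaluation pairs."""
    n = x.size
    g = np.zeros(n)
    for i in range(n):
        e = np.zeros(n); e[i] = h
        g[i] = (f(x + e) - f(x - e)) / (2.0 * h)
    return g

def fd_hessian(f, x, h):
    """Symmetric second-order finite-difference Hessian (O(n^2) evaluations)."""
    n = x.size
    H = np.zeros((n, n))
    fx = f(x)
    for i in range(n):
        ei = np.zeros(n); ei[i] = h
        for j in range(i, n):
            ej = np.zeros(n); ej[j] = h
            H[i, j] = (f(x + ei + ej) - f(x + ei) - f(x + ej) + fx) / h**2
            H[j, i] = H[i, j]
    return H

def lazy_regularized_newton(f, x0, h=1e-4, sigma=1.0, m=5,
                            max_iter=100, tol=1e-6):
    """Hypothetical sketch: the Hessian approximation is recomputed only
    every m iterations (the 'lazy' update); in between, the stored copy
    is reused while the gradient is re-approximated at every iterate."""
    x = x0.astype(float).copy()
    H = fd_hessian(f, x, h)
    for k in range(max_iter):
        if k % m == 0:
            H = fd_hessian(f, x, h)   # lazy Hessian refresh
        g = fd_gradient(f, x, h)
        if np.linalg.norm(g) < tol:  # approximate stationarity test
            break
        # Regularized Newton step: solve (H + sigma * I) s = -g
        s = np.linalg.solve(H + sigma * np.eye(x.size), -g)
        x = x + s
    return x
```

On a simple smooth test problem such as `f(x) = sum((x - 1)**2)`, this sketch converges to the minimizer using only function evaluations, which is the sense in which the method is zeroth-order; the complexity result in the abstract counts exactly these evaluations.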

Status: accepted
