152. An Inexact Restoration based algorithm with random models for unconstrained noisy optimization
Invited abstract in session WE-5: Randomized optimization algorithms part 1/2, stream Randomized optimization algorithms.
Wednesday, 14:10 - 15:50, Room: M:N
Authors (first author is the speaker)
1. Simone Rebegoldi, Dipartimento di Scienze Fisiche, Informatiche e Matematiche, Università di Modena e Reggio Emilia
2. Benedetta Morini, Dipartimento di Ingegneria Industriale, Università di Firenze
Abstract
In this talk, we focus on unconstrained differentiable optimization problems in which the evaluations of the objective function and its gradient are noisy. First, we consider a constrained reformulation of the original problem based on the Inexact Restoration (IR) approach, in which y denotes the noise level of the evaluations, h(y) is a non-negative measure of their accuracy in probability, and the constraint h(y) = 0 represents the ideal case in which the estimates are evaluated exactly. Such a reformulation is viable whenever sample average approximations are used as the noisy evaluations of both the function and the gradient. Then, we propose a trust-region algorithm with first-order random models that leverages the IR constrained reformulation of the problem. At each iteration, the proposed algorithm enforces sufficient accuracy in probability of the function and gradient estimates, and then employs an acceptance test involving both the noisy function and the infeasibility measure. We prove an iteration complexity result bounding the expected number of iterations needed to reach an approximate first-order optimality point. Numerical experiments on least squares problems show that the proposed algorithm tends to compute larger trust-region radii than previously proposed trust-region algorithms with random models, and compares well with them.
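The abstract leaves the reformulation implicit; the following LaTeX sketch writes out one plausible form consistent with the description above. The symbol for the sample-average estimate is our notation, not necessarily the authors':

```latex
% One plausible reading of the IR-based constrained reformulation:
% x is the decision variable, y the noise level of the evaluations.
\[
  \min_{x,\,y} \ \tilde f(x, y)
  \qquad \text{subject to} \qquad h(y) = 0,
\]
% where \tilde f(x,y) denotes a sample-average estimate of f(x) at noise
% level y, h(y) >= 0 measures the accuracy in probability of the estimates,
% and h(y) = 0 corresponds to exact (noise-free) evaluations.
```

Likewise, the Python sketch below is meant only to illustrate the general flavour of a trust-region iteration with first-order random models built from sample averages and an IR-style acceptance test that combines the noisy decrease with the infeasibility measure h. All names, parameters, and update rules (`tr_ir_step`, `eta`, `theta`, `gamma`, the sample-size rule) are illustrative assumptions and do not reproduce the authors' actual algorithm or its analysis.

```python
import numpy as np

def sample_average(fun_i, x, indices):
    """Average of the per-sample functions fun_i(x, i) over `indices`."""
    return np.mean([fun_i(x, i) for i in indices], axis=0)

def tr_ir_step(fun_i, grad_i, x, delta, y, h, rng, n_total,
               eta=1e-4, theta=0.5, gamma=2.0):
    """One illustrative iteration: build a first-order random model from a
    sample average, take a Cauchy-like step of length delta, and accept or
    reject using both the noisy decrease and the infeasibility h (IR flavour)."""
    # Sample size grows as the accuracy parameter y shrinks (assumption).
    m = max(1, int(np.ceil((1.0 - y) * n_total)))
    idx = rng.choice(n_total, size=m, replace=False)

    f_est = sample_average(fun_i, x, idx)    # noisy function value
    g_est = sample_average(grad_i, x, idx)   # noisy gradient

    step = -delta * g_est / max(np.linalg.norm(g_est), 1e-12)
    f_trial = sample_average(fun_i, x + step, idx)

    pred = delta * np.linalg.norm(g_est)     # model (predicted) decrease
    ared = f_est - f_trial                   # actual (noisy) decrease

    # IR-style test: require decrease of a merit combining the noisy
    # function and the infeasibility h(y); y_trial proposes tighter accuracy.
    y_trial = theta * y
    accepted = ared + (h(y) - h(y_trial)) >= eta * (pred + h(y))

    if accepted:
        return x + step, gamma * delta, y_trial
    return x, delta / gamma, y
```

A toy least squares instance can drive the iteration, for example:

```python
rng = np.random.default_rng(0)
A, b = rng.standard_normal((50, 5)), rng.standard_normal(50)
fun_i = lambda x, i: 0.5 * (A[i] @ x - b[i]) ** 2
grad_i = lambda x, i: (A[i] @ x - b[i]) * A[i]
x, delta, y = np.zeros(5), 1.0, 0.9
for _ in range(100):
    x, delta, y = tr_ir_step(fun_i, grad_i, x, delta, y,
                             h=lambda y: y, rng=rng, n_total=50)
```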
Keywords
- Linear and nonlinear optimization
- Optimization under uncertainty and applications
Status: accepted