884. Statistical Performance of Subgradient Step-Size Update Rules in Lagrangian Relaxations of Chance-Constrained Optimization Models
Invited abstract in session WC-31: Solution Algorithms for Optimization under Uncertainty 2, stream Stochastic and Robust Optimization.
Wednesday, 12:30-14:00, Room: Maurice Keyworth 1.06
Authors (first author is the speaker)
1. Bismark Singh, School of Mathematical Sciences, University of Southampton
Abstract
Lagrangian relaxation schemes, coupled with a subgradient procedure, are frequently employed to solve chance-constrained optimization models. Subgradient procedures typically rely on step-size update rules. Although there is extensive research on the properties of these step-size update rules, there is little consensus on which rules are most suitable in practice, especially when the underlying model is a computationally challenging instance of a chance-constrained program. To close this gap, we seek to determine whether a single step-size rule can be statistically guaranteed to perform better than others. We couple the Lagrangian procedure with three strategies to identify lower bounds for two-stage chance-constrained programs. We consider two instances of such models that differ in the presence of binary variables in the second stage. Through a series of computational experiments, we demonstrate that, in marked contrast to existing theoretical results, no statistically significant differences in optimality gaps are detected among six well-known step-size update rules. Nevertheless, our results show that a Lagrangian procedure provides a computational benefit over a naive solution method, regardless of the underlying step-size update rule.
This work is published here: https://doi.org/10.1007/978-3-031-47859-8_26.
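To make the setting concrete, the sketch below runs a projected subgradient ascent on a toy Lagrangian dual and compares three classic step-size update rules (constant, diminishing 1/k, and Polyak). This is a minimal illustration under assumed data: the dual function, its coefficients, and the optimal-value estimate are illustrative stand-ins, not the chance-constrained models or the six rules studied in the paper.

```python
# Minimal sketch: projected subgradient ascent on a toy Lagrangian dual,
# comparing three classic step-size rules. The dual function below is a
# hypothetical stand-in for the dual of a relaxed chance-constrained program.
import numpy as np

def dual_and_subgradient(lam):
    # Toy concave dual g(lam) = min_i (a_i . lam + b_i); a subgradient at
    # lam is the row a_i attaining the minimum.
    A = np.array([[1.0, -2.0], [-1.0, 0.5], [0.5, 1.0]])
    b = np.array([2.0, 1.0, -0.5])
    vals = A @ lam + b
    i = np.argmin(vals)
    return vals[i], A[i]

def subgradient_ascent(rule, iters=200, g_star=None):
    lam = np.zeros(2)      # multipliers for (assumed) inequality constraints
    best = -np.inf         # best dual value seen so far
    for k in range(1, iters + 1):
        g, s = dual_and_subgradient(lam)
        best = max(best, g)
        norm2 = s @ s
        if norm2 == 0.0:   # zero subgradient: lam is dual optimal
            break
        if rule == "constant":
            step = 0.1
        elif rule == "diminishing":   # classic 1/k rule
            step = 1.0 / k
        elif rule == "polyak":        # requires an estimate of the optimum
            step = (g_star - g) / norm2
        lam = np.maximum(lam + step * s, 0.0)   # project onto lam >= 0
    return best

for rule in ("constant", "diminishing", "polyak"):
    # g_star = 0.79 is a rough estimate of this toy dual's optimal value
    print(rule, subgradient_ascent(rule, g_star=0.79))
```

On this well-behaved toy instance all three rules reach essentially the same best dual value, which mirrors, in miniature, the abstract's empirical point that the choice of rule can matter less than having the Lagrangian procedure at all.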
Keywords
- Programming, Stochastic
- Programming, Mixed-Integer
- Non-smooth Optimization
Status: accepted