2251. A regression model for selecting effective cuts in a Branch-and-Cut algorithm
Invited abstract in session MD-3: (Deep) Reinforcement Learning for Combinatorial Optimization 2, stream Data Science Meets Optimization.
Monday, 14:30-16:00, Room: 1005 (building: 202)
Authors (first author is the speaker)
1. Marcello Sammarra, Istituto di Calcolo e Reti ad Alte Prestazioni, Consiglio Nazionale delle Ricerche
2. Giovanni Giallombardo, DIMES, University of Calabria
3. Giovanna Miglionico, DIMES, Università della Calabria
Abstract
The performance of a Branch-and-Cut algorithm depends, among other things, on the ability to select, from the available cuts, a given number to be added to the current LP relaxation in order to obtain tight bounds. Clearly, the best bound improvement is achieved by adding many cuts, but, as a counterpart, the time needed to solve the LP also increases considerably. Moreover, it may happen that only a fraction of the added cuts actually contributes to the bound improvement, the remaining ones not being tight. Cut selection is therefore a very challenging task. Modern ILP solvers, such as CPLEX, GUROBI, XPRESS, and MOSEK, adopt greedy rules: cuts are first ranked with a score function, and then the cuts with the highest scores are added to the LP relaxation. In general, a cut score is a combination of well-established measures of cut quality, e.g., violation, parallelism, etc.
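As a minimal sketch of the greedy rule described above (not taken from any specific solver), the snippet below ranks candidate cuts a^T x <= b by a weighted combination of two classical quality measures, violation and parallelism with the objective, and keeps the top k. The function names, the feature normalization, and the weights are hypothetical choices for illustration only.

```python
import numpy as np

def select_cuts_greedy(cuts, x_lp, c, k, w_violation=0.7, w_parallelism=0.3):
    """Return the indices of the k highest-scoring cuts.

    cuts : list of (a, b) pairs, each representing a cut a^T x <= b
    x_lp : current LP-relaxation solution
    c    : objective coefficient vector
    k    : number of cuts to add
    """
    scores = []
    for a, b in cuts:
        # Violation: how much the LP point breaks the cut (norm-scaled).
        violation = (a @ x_lp - b) / (np.linalg.norm(a) + 1e-12)
        # Parallelism: cosine similarity between the cut and the objective.
        parallelism = abs(a @ c) / (np.linalg.norm(a) * np.linalg.norm(c) + 1e-12)
        scores.append(w_violation * violation + w_parallelism * parallelism)
    # Greedy rule: add the k cuts with the highest combined score.
    return list(np.argsort(scores)[::-1][:k])
```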
Adopting the bound improvement as cut score, in this talk we show how a regression model can be learned to predict cut scores for unseen ILP instances. Preliminary results and future research directions are discussed.
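The learning-based scoring can be illustrated with a minimal sketch, assuming a simple per-cut feature vector and an off-the-shelf regressor: a model is trained on cuts whose bound improvement has been measured on training instances, and its predictions are then used as the cut score on unseen instances. The feature set, the model choice, and the way training labels are collected are all assumptions for illustration, not the authors' actual design.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

def cut_features(a, b, x_lp, c):
    """Simple per-cut feature vector (assumed, not the authors' features)."""
    norm_a = np.linalg.norm(a) + 1e-12
    return np.array([
        (a @ x_lp - b) / norm_a,                              # normalized violation
        abs(a @ c) / (norm_a * (np.linalg.norm(c) + 1e-12)),  # parallelism to objective
        np.count_nonzero(a) / len(a),                         # cut density
    ])

def train_score_model(X, y):
    """X: feature rows for cuts from training instances; y: observed bound improvements."""
    model = GradientBoostingRegressor()
    model.fit(X, y)
    return model

def select_cuts_learned(model, cuts, x_lp, c, k):
    """Rank unseen cuts by predicted bound improvement and keep the best k."""
    X = np.vstack([cut_features(a, b, x_lp, c) for a, b in cuts])
    predicted_scores = model.predict(X)
    return list(np.argsort(predicted_scores)[::-1][:k])
```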
Keywords
- Programming, Integer
- Branch and Cut
Status: accepted