560. Optimization over trained graph neural networks: mixed-integer formulations, symmetry breaking, and optimal molecular design
Invited abstract in session TA-4: Topics in Mixed Integer Programming and Nonconvex Optimization, stream MINLP.
Tuesday, 8:30-10:00, Room: 1001 (building: 202)
Authors (first author is the speaker)
1. Shiqiang Zhang, Department of Computing, Imperial College London
Abstract
Optimization over trained machine learning models has applications including verification, minimizing neural acquisition functions, and integrating a trained surrogate into a larger decision-making problem. We formulate and solve optimization problems constrained by trained graph neural networks (GNNs). For the classical GNN architectures considered in this talk, optimizing over a GNN with a fixed graph is equivalent to optimizing over a dense neural network. We therefore study the case where the input graph is not fixed, so that each edge is a decision variable, and develop two mixed-integer optimization formulations. To circumvent the symmetry caused by graph isomorphism, we propose two types of symmetry-breaking constraints and prove that adding them leaves at least one solution from each class of symmetric solutions feasible. As a byproduct, we construct a graph indexing algorithm that yields an indexing satisfying these constraints. To test our symmetry-breaking strategies and optimization formulations, we consider an application in optimal molecular design.
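The symmetry-breaking idea can be illustrated with a simple connectivity-based indexing rule: require every vertex other than vertex 0 to have at least one lower-indexed neighbor. The sketch below is illustrative only, assuming this particular rule; the function names and the BFS-based reindexing are hypothetical stand-ins, not the exact constraints or algorithm from the talk.

```python
from collections import deque

def satisfies_indexing(adj):
    """Check a sample symmetry-breaking rule on an adjacency matrix:
    every vertex v >= 1 must have a neighbor with a smaller index."""
    n = len(adj)
    return all(any(adj[v][u] for u in range(v)) for v in range(1, n))

def reindex(adj):
    """Relabel vertices by BFS order from vertex 0. For a connected
    graph, every vertex discovered later has an earlier-discovered
    neighbor, so the new labeling satisfies the rule above."""
    n = len(adj)
    order, seen = [], [False] * n
    queue = deque([0])
    seen[0] = True
    while queue:
        v = queue.popleft()
        order.append(v)
        for u in range(n):
            if adj[v][u] and not seen[u]:
                seen[u] = True
                queue.append(u)
    # Permute rows and columns according to the BFS order.
    return [[adj[order[i]][order[j]] for j in range(n)] for i in range(n)]
```

For example, a star graph whose center is labeled 2 violates the rule (vertex 1's only neighbor has a larger index), but relabeling by BFS order repairs it. In a mixed-integer formulation, a rule of this kind would instead be encoded as linear constraints on the binary edge variables.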
Keywords
- Combinatorial Optimization
- Mathematical Programming
- Programming, Mixed-Integer
Status: accepted