EURO 2024 Copenhagen

2414. Feature Selection for Regression Neural Network using Mathematical Programming

Invited abstract in session TD-27: Feature attribution and selection for XAI, stream Mathematical Optimization for XAI.

Tuesday, 14:30-16:00
Room: 047 (building: 208)

Authors (first author is the speaker)

1. Georgios Liapis
Chemical Engineering, University College London
2. Sophia Tsoka
Department of Informatics, King’s College London
3. Lazaros Papageorgiou
Chemical Engineering, University College London

Abstract

In recent years, a growing body of literature has focused on training sparse deep neural networks or refining already trained ones to make them sparser. This strategic reduction of network complexity has been shown to mitigate overfitting, improve overall generalisation performance and enhance explainability. This work introduces a feature selection approach tailored for regression tasks in trained neural networks that use Rectified Linear Unit (ReLU) activation functions. The methodology simplifies the network by identifying and retaining only the most important features in the input layer. The problem is mathematically formulated as a Mixed-Integer Linear Programming (MILP) model, which represents the ReLU operator with binary variables, enabling the application of big-M constraints, while limiting the number of active features. The objective of the model is to identify and eliminate the features that do not significantly contribute to the prediction quality of the neural network, so that the network can be re-trained using only the retained features. A binning strategy is employed on the output variable in order to assess the importance of features across the entire spectrum of the output. Experiments on a number of real-world datasets demonstrate that the complexity of the network is reduced while good predictive performance is maintained.
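The selection problem described above can be sketched in pure Python on a toy scale. The network weights, data, and the cardinality bound below are all hypothetical, and exhaustive search over feature subsets stands in for the MILP solver (which in the actual method encodes each ReLU unit h = max(0, a) with a binary variable u and big-M constraints such as h >= a, h <= a + M(1-u), 0 <= h <= M*u). This is a minimal illustration of the objective, not the authors' implementation:

```python
# Toy sketch of the feature-selection problem the abstract formulates as a MILP.
# All weights and data are hypothetical. The MILP encodes ReLU with binary
# variables and big-M constraints and bounds the number of active features;
# here, at toy scale, the same subset is found by exhaustive search.
from itertools import combinations

# Hypothetical trained network: 3 inputs -> 2 ReLU hidden units -> 1 output.
W1 = [[0.5, -0.2, 0.01],
      [0.1,  0.8, -0.02]]   # feature 2 carries near-zero weights
b1 = [0.0, 0.1]
W2 = [1.0, -0.5]
b2 = 0.2

X = [[1.0, 2.0, 0.5], [0.2, 1.5, 1.0], [2.0, 0.3, 0.8]]

def predict(x, keep):
    """Forward pass with features outside `keep` zeroed (z_j = 0 in the MILP)."""
    masked = [x[j] if j in keep else 0.0 for j in range(len(x))]
    h = [max(0.0, sum(W1[i][j] * masked[j] for j in range(3)) + b1[i])
         for i in range(2)]
    return sum(W2[i] * h[i] for i in range(2)) + b2

# Reference predictions of the full network on the data.
y = [predict(x, {0, 1, 2}) for x in X]

k = 2  # cardinality bound: sum_j z_j <= k in the MILP
best = min(combinations(range(3), k),
           key=lambda keep: sum(abs(predict(x, set(keep)) - t)
                                for x, t in zip(X, y)))
print(best)  # subset of features that best preserves the predictions: (0, 1)
```

With the near-zero weights on feature 2, dropping it barely perturbs the hidden pre-activations, so the subset (0, 1) minimises the deviation from the full network's outputs; the network would then be re-trained on those two features alone.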

Status: accepted
