EURO 2025 Leeds
Abstract Submission

3083. Explainable AI and Multicriteria Decision Support: Real-World Applications and Perspectives

Invited abstract in session WD-8: Real-life applications of multi-criteria and multi-group decision making: practical issues, opportunities and challenges, stream Multiple Criteria Decision Aiding.

Wednesday, 14:30-16:00
Room: Clarendon SR 2.08

Authors (first author is the speaker)

1. Inès SAAD
Laboratory MIS (UPJV), ESC Amiens

Abstract

The rise of artificial intelligence systems in decision domains such as healthcare, cybersecurity, data management, and education highlights the growing need for explainable models. While machine learning approaches achieve remarkable performance levels, their opacity remains a major barrier to adoption, particularly in contexts where transparency and justification of decisions are essential.
The Dominance-based Rough Set Approach (DRSA) stands out as a particularly suitable method for multi-criteria classification problems, offering interpretable models that align with decision-makers' preferences.
A key advantage of DRSA lies in its use of "Reference Actions", constructed in collaboration with experts, to extract and formalize their preferences in the form of "If-Then" decision rules. These rules, both comprehensible and traceable, foster better adoption of the models by end users.
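To make the structure of such a rule concrete, the minimal Python sketch below shows a hypothetical "at least" decision rule of the kind DRSA induces from reference actions; the criterion names, thresholds, and class label are illustrative assumptions and are not taken from the study.

from dataclasses import dataclass

@dataclass
class AtLeastRule:
    """If the action's evaluation meets the threshold on every listed
    criterion, then it belongs to the union of classes 'at least'
    target_class (gain-type criteria assumed)."""
    conditions: dict[str, float]   # criterion -> minimal required evaluation
    target_class: str              # e.g. "at least Acceptable"

    def matches(self, action: dict[str, float]) -> bool:
        # Every condition must be satisfied for the rule to apply.
        return all(action[g] >= t for g, t in self.conditions.items())

# Hypothetical rule formalized from expert-validated reference actions:
rule = AtLeastRule(
    conditions={"data_quality": 4.0, "security_level": 3.0},
    target_class="at least Acceptable",
)

candidate = {"data_quality": 4.5, "security_level": 3.0, "cost": 2.0}
if rule.matches(candidate):
    # The recommendation is traceable: the rule names the criteria and
    # thresholds that justify it to the decision-maker.
    print(f"Recommend: {rule.target_class} (criteria: {rule.conditions})")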
In this study, we present several extensions of DRSA, designed and applied to real-world cases across multiple sectors. We demonstrate how our explainable models go beyond providing accurate predictions by enabling decision-makers (medical professionals, cybersecurity specialists, educators, and community managers) to understand, interpret, and trust the generated recommendations. Our findings emphasize that DRSA serves as a key enabler for transparent, justifiable, and expert-aligned decision-making.

Keywords

Status: accepted
