3111. A Quantum and Binary Incremental Learning Procedure for Categorical and Ordinal Classification
Invited abstract in session TC-38: Forecasting, prediction and optimization 3, stream Data Science meets Optimization.
Tuesday, 12:30-14:00, Room: Michael Sadler LG19
Authors (first author is the speaker)
1. Yueh LIN (Quantitative Methods, IESEG School of Management)
2. Luis Perez (Operations Management, IESEG)
3. Stefano Nasini (IESEG School of Management)
4. Martine Labbé (Computer Science, Université Libre de Bruxelles)
Abstract
This paper explores mathematical programming approaches for categorical and ordinal classification, leveraging decision tree and disjunctive normal form representations. We analyze the equivalence conditions of these formulations and develop an incremental learning procedure that solves a sequence of reduced problems. We examine both linear (MILP) and quadratic (QUBO) formulations, solving the latter using quantum adiabatic/variational techniques. Our incremental learning approach strategically reduces problem size by iteratively refining feature selection, enabling efficient solutions while preserving classification accuracy. Computational experiments demonstrate that our method significantly accelerates convergence while achieving optimal (or near-optimal) classification accuracy across benchmark datasets. Moreover, our results reveal that quantum adiabatic/variational methods can outperform traditional branch-and-bound and feasibility pump algorithms for small to medium-sized instances. These findings lay the foundation for a unified, optimization-driven approach to statistical classification, bridging classical and quantum computational paradigms.
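To make the QUBO (Quadratic Unconstrained Binary Optimization) side of the abstract concrete, here is a minimal illustrative sketch, assuming a made-up toy objective matrix `Q` for a three-feature selection problem; it is not the paper's actual formulation, and it uses exhaustive search rather than the quantum adiabatic/variational solvers the authors employ.

```python
# Brute-force solver for a tiny QUBO instance: minimize x^T Q x over
# binary vectors x. The matrix Q below is a hypothetical toy example
# (reward selecting features 0 and 2, penalize selecting 0 and 1 together).
from itertools import product

def solve_qubo(Q):
    """Enumerate all binary vectors and return the minimizer and its value."""
    n = len(Q)
    best_x, best_val = None, float("inf")
    for bits in product((0, 1), repeat=n):
        val = sum(Q[i][j] * bits[i] * bits[j]
                  for i in range(n) for j in range(n))
        if val < best_val:
            best_x, best_val = bits, val
    return best_x, best_val

Q = [
    [-2,  3,  0],
    [ 0, -1,  0],
    [ 0,  0, -2],
]
x, val = solve_qubo(Q)
print(x, val)  # -> (1, 0, 1) -4
```

Exhaustive search is only viable for a handful of variables; the abstract's incremental procedure addresses exactly this scaling issue by shrinking the problem before handing it to an exact or quantum solver.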
Keywords
- Optimization Modeling
- Combinatorial Optimization
- Machine Learning
Status: accepted