EURO 2025 Leeds
Abstract Submission

156. Improved intuitionistic fuzzy least square twin support vector machine for class imbalance learning

Invited abstract in session WB-38: Optimization and Machine Learning: Methodological Advances, stream Data Science meets Optimization.

Wednesday, 10:30-12:00
Room: Michael Sadler LG19

Authors (first author is the speaker)

1. Yash Arora
Department of Mathematics, IIT Roorkee
2. S. K. Gupta
Indian Institute of Technology Roorkee

Abstract

In the data mining community, imbalanced datasets pose a persistent challenge that degrades the performance of classification methods. Although the support vector machine (SVM) is a well-established classification technique, it struggles with imbalanced datasets because it assigns equal importance to all training samples. Several algorithm-level approaches improve the performance of SVM on imbalanced datasets; however, they often struggle in the presence of noise and outliers. To address these issues, we propose an improved intuitionistic fuzzy least square twin SVM for imbalanced datasets. It introduces a novel membership function that integrates Gaussian and entropy-based score functions with the imbalance ratio. The Gaussian membership reduces the impact of noisy training points, while neighborhood entropy is used to compute the non-membership, distinguishing support vectors from outliers. The imbalance ratio is incorporated into the membership function to counter the skewed class distribution. The model's efficiency is enhanced by obtaining the decision hyperplanes from two systems of linear equations rather than by solving a quadratic programming problem. To demonstrate the performance of the proposed method, comprehensive experiments are conducted on synthetic and benchmark imbalanced datasets. The experimental findings reveal that the proposed method outperforms other baseline models, highlighting its effectiveness in addressing real-world problems.
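The abstract describes the weighting scheme only at a high level. The sketch below is one plausible reading of it in Python, not the authors' formulation: the Gaussian membership, the entropy-based non-membership, the way they combine, and the parameters `sigma` and `k` are all illustrative assumptions.

```python
import numpy as np

def fuzzy_scores(X, y, minority_label, sigma=1.0, k=5):
    """Illustrative intuitionistic fuzzy sample weights (assumed formulation).

    - membership mu_i: Gaussian of the distance to the class centroid,
    - non-membership nu_i: k-nearest-neighbour label entropy, scaled by (1 - mu_i),
    - minority-class weights boosted by the imbalance ratio.
    """
    n = len(y)
    weights = np.zeros(n)
    # Imbalance ratio: majority count over minority count (assumed definition).
    imbalance_ratio = np.sum(y != minority_label) / max(np.sum(y == minority_label), 1)
    for cls in np.unique(y):
        idx = np.where(y == cls)[0]
        Xc = X[idx]
        centroid = Xc.mean(axis=0)
        # Gaussian membership: down-weights points far from the class centre (noise).
        d = np.linalg.norm(Xc - centroid, axis=1)
        mu = np.exp(-d**2 / (2 * sigma**2))
        # Entropy-based non-membership from the labels of the k nearest neighbours:
        # a mixed-label neighbourhood (high entropy) plus low membership flags an outlier.
        nu = np.zeros(len(idx))
        for j, i in enumerate(idx):
            dist = np.linalg.norm(X - X[i], axis=1)
            nn = np.argsort(dist)[1:k + 1]            # exclude the point itself
            p = np.mean(y[nn] == cls)                 # fraction of same-class neighbours
            ent = 0.0 if p in (0.0, 1.0) else -(p * np.log2(p) + (1 - p) * np.log2(1 - p))
            nu[j] = (1 - mu[j]) * ent
        score = mu * (1 - nu)                         # combined intuitionistic score
        if cls == minority_label:
            score *= imbalance_ratio                  # counter the skewed distribution
        weights[idx] = score
    return weights
```

In a least square twin SVM these weights would multiply the slack terms of each training point, so that the two hyperplanes are still obtained from two systems of linear equations, with noisy and outlying points contributing less.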

Keywords

Status: accepted
