123. Towards Convexity in Anomaly Detection: A New Formulation of SSLM with Unique Optimal Solutions
Invited abstract in session WB-38: Optimization and Machine Learning: Methodological Advances, stream Data Science meets Optimization.
Wednesday, 10:30-12:00, Room: Michael Sadler LG19
Authors (first author is the speaker)
1. Hao Wang, Information Science and Technology, ShanghaiTech University
Abstract
A long-standing issue with widely used anomaly-detection methods such as Support Vector Data Description (SVDD) and the Small Sphere and Large Margin SVM (SSLM) is their nonconvexity, which prevents the kind of optimal-solution analysis available for SVMs and limits their applicability in large-scale settings. In this paper, we introduce a novel convex SSLM formulation and show that it reduces to a convex quadratic programming problem for the hyperparameter values of interest. Leveraging this convexity, we derive a number of results that are unattainable with traditional nonconvex approaches. We give a thorough analysis of how the hyperparameters influence the optimal solution, pointing out scenarios in which optimal solutions can be found trivially and identifying instances of ill-posedness. Most notably, we establish connections between our method and traditional approaches, and we characterize exactly when the optimal solution is unique, a task unachievable with traditional nonconvex methods. We also derive a ν-property that elucidates how the hyperparameters govern the fractions of support vectors and margin errors in the positive and negative classes.
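The abstract does not state the new formulation itself, but the convex-QP structure it alludes to can be illustrated with the classical SVDD dual, which is already a convex quadratic program whenever the kernel matrix is positive semidefinite. The sketch below is a minimal illustration of that baseline, not the authors' convex SSLM; the penalty parameter C, the toy data, and the use of CVXPY are all assumptions made for demonstration.

```python
import numpy as np
import cvxpy as cp

# Toy data (assumption: 2-D Gaussian points standing in for "normal" samples).
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 2))
n = X.shape[0]

# Linear-kernel Gram matrix; PSD up to round-off, hence the psd_wrap below.
K = X @ X.T
C = 0.1  # hypothetical soft-margin penalty (upper bound on the dual variables)

# Classical SVDD dual, a convex QP in alpha (concave maximization):
#   max_alpha  sum_i alpha_i K_ii - alpha^T K alpha
#   s.t.       sum_i alpha_i = 1,  0 <= alpha_i <= C
alpha = cp.Variable(n)
objective = cp.Maximize(np.diag(K) @ alpha - cp.quad_form(alpha, cp.psd_wrap(K)))
constraints = [cp.sum(alpha) == 1, alpha >= 0, alpha <= C]
cp.Problem(objective, constraints).solve()

a = alpha.value
center = a @ X  # sphere center c = sum_i alpha_i x_i

# Radius from an unbounded support vector (0 < alpha_i < C); assumes one exists.
sv = np.flatnonzero((a > 1e-6) & (a < C - 1e-6))[0]
radius = np.linalg.norm(X[sv] - center)
print(f"center={center}, radius={radius:.4f}")
```

Because the objective is a concave quadratic maximized over a polytope, any solver returns a global maximizer; the uniqueness analysis highlighted in the abstract concerns precisely when such a maximizer is unique, a question that only becomes tractable once the problem is convex.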
Keywords
- Convex Optimization
- Machine Learning
- Quadratic Programming
Status: accepted