766. Specialized interior point method for support vector machines based on variable splitting
Invited abstract in session WB-30: Specialized Optimization Algorithms, stream Software for Optimization.
Wednesday, 10:30-12:00, Room: 064 (building: 208)
Authors (first author is the speaker)
1. Jordi Castro, Dept. of Statistics and Operations Research, Universitat Politecnica de Catalunya
Abstract
Variable splitting has traditionally been employed in (first-order) Lagrangian decomposition approaches and, more recently, in the Alternating Direction Method of Multipliers (ADMM). In this presentation we will show how variable splitting can be used efficiently within (second-order) interior point methods for solving large support vector machine (SVM) problems; SVMs are a binary classification technique extensively used in machine learning. Briefly, by replicating variables, the SVM problem can be decomposed into smaller SVMs with additional linking constraints that equate the values of the different copies. The structure of the resulting problem can be exploited by specialized interior point methods that compute the Newton direction with a combination of direct and iterative solvers (namely, Cholesky factorizations and preconditioned conjugate gradients). This new approach is compared with state-of-the-art solvers for SVMs, which are based on either interior point algorithms (such as SVM-OOPS) or specific algorithms developed by the machine learning community (such as LIBSVM).
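To make the splitting idea concrete, the following is a minimal sketch assuming the standard linear soft-margin SVM primal with the training points partitioned into k blocks B_1, ..., B_k; the block structure, scaling, and notation here are illustrative assumptions and may differ from the formulation in the reference below.

\begin{align*}
\min_{w_1,\dots,w_k,\;\gamma,\;\xi}\quad & \frac{1}{2k}\sum_{j=1}^{k}\|w_j\|_2^{2} \;+\; C\sum_{j=1}^{k}\sum_{i\in B_j}\xi_i \\
\text{s.t.}\quad & y_i\bigl(w_j^{\top}x_i + \gamma\bigr) \;\ge\; 1-\xi_i, \qquad i\in B_j,\ j=1,\dots,k, \\
& w_1 = w_2 = \dots = w_k \qquad \text{(linking constraints)}, \\
& \xi_i \ge 0, \qquad i\in B_1\cup\dots\cup B_k.
\end{align*}

Because the linking constraints force all copies w_j to coincide at feasibility, the split problem is equivalent to the original SVM; what is gained is the block structure that the specialized interior point method exploits when combining Cholesky factorizations with preconditioned conjugate gradients.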
Reference:
J. Castro, New interior-point approach for one- and two-class linear support vector machines using multiple variable splitting, Journal of Optimization Theory and Applications (2022), https://doi.org/10.1007/s10957-022-02103-1.
Keywords
- Interior Point Methods
- Machine Learning
- Software
Status: accepted