Multi-weight vector projection support vector machines
Published in: Pattern Recognition Letters, 2010-10, Vol. 31 (13), pp. 2006-2011
Main Authors:
Format: Article
Language: English
Summary: Proximal support vector machine via generalized eigenvalues (GEPSVM), a variant of SVM, was originally motivated by the need to effectively classify XOR problems that are not linearly separable. Analysis and experiments have shown it to outperform SVM in terms of reduced time complexity. However, GEPSVM has two major disadvantages: (1) some complex XOR problems cannot be effectively classified; (2) it may fail to reach a stable solution because of matrix singularity. By defining a new principle, we propose an original algorithm, called multi-weight vector projection support vector machines (MVSVM). The proposed method not only keeps the superior characteristics of GEPSVM but also offers additional advantages: (1) it performs well on complex XOR datasets; (2) instead of the generalized eigenvalue problems of GEPSVM, MVSVM solves two standard eigenvalue problems, avoiding GEPSVM's matrix singularity; (3) its generalization ability is comparable to or better than that of SVM and GEPSVM; (4) it is the fastest of the three algorithms. Experiments on artificial and public datasets also demonstrate the effectiveness of MVSVM.
ISSN: 0167-8655; 1872-7344
DOI: 10.1016/j.patrec.2010.06.005
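The summary above contrasts the generalized eigenvalue problems of GEPSVM with the two standard eigenvalue problems solved by MVSVM. The snippet below is a minimal NumPy/SciPy sketch of that distinction only: the first part sets up a GEPSVM-style ratio criterion for a proximal plane w·x + γ = 0, which leads to a generalized eigenproblem that becomes unstable when the denominator matrix is singular; the second part solves a single symmetric standard eigenproblem for a projection direction, using an illustrative difference-of-scatters criterion that is an assumption for demonstration, not the exact MVSVM objective from the paper.

```python
import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(0)
A = rng.normal(size=(60, 2))             # samples of class +1 (rows are points)
B = rng.normal(loc=3.0, size=(60, 2))    # samples of class -1

def augment(X):
    """Append a column of ones so z = [w; gamma] parameterises the plane w.x + gamma = 0."""
    return np.hstack([X, np.ones((X.shape[0], 1))])

# GEPSVM-style plane for class +1: minimise ||[A e]z||^2 / ||[B e]z||^2,
# a generalized eigenvalue problem G z = lambda H z.  eigh(G, H) requires H to be
# positive definite; a singular H is exactly the instability the summary mentions.
G = augment(A).T @ augment(A)
H = augment(B).T @ augment(B)
vals, vecs = eigh(G, H)
z = vecs[:, 0]                           # eigenvector of the smallest eigenvalue
w_gep, gamma_gep = z[:-1], z[-1]

# A standard symmetric eigenproblem of the kind the summary refers to: choose a
# projection direction w maximising a *difference* of scatter matrices, so only
# np.linalg.eigh on one matrix is needed and no generalized solve can fail.
# (Illustrative criterion only -- not the exact MVSVM objective.)
Sw = (A - A.mean(0)).T @ (A - A.mean(0))                       # within-class scatter of A
Sb = np.outer(A.mean(0) - B.mean(0), A.mean(0) - B.mean(0))    # between-class scatter
vals_std, vecs_std = np.linalg.eigh(Sb - Sw)
w_std = vecs_std[:, -1]                  # eigenvector of the largest eigenvalue

print("GEPSVM-style plane:   w =", w_gep, " gamma =", gamma_gep)
print("standard-eigenproblem projection: w =", w_std)
```

In the sketch, the generalized solve depends on inverting (implicitly) the class-B scatter, whereas the standard solve works on one symmetric matrix directly, which is the singularity-avoidance point made in the summary.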