Imposing Symmetry in Least Squares Support Vector Machines Regression
Format: Conference Proceeding
Language: English
Summary: In this paper we show how to use relevant prior information by imposing symmetry conditions (odd or even) on the Least Squares Support Vector Machines (LS-SVM) regression formulation. This is done by adding a simple constraint to the LS-SVM model, which ultimately translates into a new kernel. This equivalent kernel embodies the prior information about symmetry, so the dimension of the final dual system is the same as in the unrestricted case. We show that using a regularization term and a soft constraint provides a general framework that contains the unrestricted LS-SVM and the symmetry-constrained LS-SVM as extreme cases. Imposing symmetry substantially improves the performance of the models, both in generalization ability and in reduced model complexity. Practical examples of NARX models and time-series prediction show satisfactory results.
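The summary's core idea, an equivalent kernel that embeds symmetry without enlarging the dual system, can be sketched as follows. This is a minimal NumPy illustration, not the paper's implementation: it assumes the standard symmetrization K_s(x, z) = (K(x, z) + s·K(x, −z)) / 2, with s = +1 for even and s = −1 for odd models, plugged into the usual LS-SVM dual linear system. All function names and parameter values here are illustrative choices. (For an odd model the bias term b should also be dropped, which this even-case sketch keeps.)

```python
import numpy as np

def rbf(X, Z, gamma=1.0):
    # Gaussian RBF kernel matrix between the rows of X and Z.
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def symmetrized_kernel(X, Z, s=+1, gamma=1.0):
    # Equivalent kernel embodying symmetry (assumed form):
    #   K_s(x, z) = (K(x, z) + s * K(x, -z)) / 2
    # s = +1 imposes an even model, s = -1 an odd one.
    return 0.5 * (rbf(X, Z, gamma) + s * rbf(X, -Z, gamma))

def lssvm_fit(X, y, reg=1.0, s=+1, gamma=1.0):
    # Standard LS-SVM dual system, same size as the unrestricted case:
    #   [[0, 1^T], [1, Omega + I/reg]] [b; alpha] = [0; y]
    n = len(y)
    Omega = symmetrized_kernel(X, X, s, gamma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = Omega + np.eye(n) / reg
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    return sol[0], sol[1:]  # bias b, dual coefficients alpha

def lssvm_predict(Xq, X, b, alpha, s=+1, gamma=1.0):
    return symmetrized_kernel(Xq, X, s, gamma) @ alpha + b

# Toy example: fit the even target f(x) = x^2 using only x > 0.
X = np.linspace(0.1, 2.0, 30)[:, None]
y = X[:, 0] ** 2
b, alpha = lssvm_fit(X, y, reg=100.0, s=+1)

# By construction the even kernel gives identical predictions at +x and -x,
# so the model extrapolates to negative inputs it never saw.
pred = lssvm_predict(np.array([[-1.0]]), X, b, alpha, s=+1)
```

Because the symmetry lives entirely inside the kernel, the dual system above has the same (n + 1) × (n + 1) size as unrestricted LS-SVM, matching the claim in the summary.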
ISSN: | 0191-2216 |
DOI: | 10.1109/CDC.2005.1583074 |