Imposing Symmetry in Least Squares Support Vector Machines Regression
In this paper we show how to use relevant prior information by imposing symmetry conditions (odd or even) on the Least Squares Support Vector Machines regression formulation. This is done by adding a simple constraint to the LS-SVM model, which ultimately translates into a new kernel. This equivalent kernel embodies the prior information about symmetry, so the dimension of the final dual system is the same as in the unrestricted case. We show that using a regularization term and a soft constraint provides a general framework which contains the unrestricted LS-SVM and the symmetry-constrained LS-SVM as extreme cases. Imposing symmetry substantially improves the performance of the models, in terms of both generalization ability and reduced model complexity. Practical examples with NARX models and time series prediction show satisfactory results.
Main Authors: | Espinoza, M. ; Suykens, J.A.K. ; De Moor, B. |
---|---|
Format: | Conference Proceeding |
Language: | English |
Subjects: | Cost function ; Kernel ; Least squares methods ; Support vector machines |
Online Access: | Request full text |
cited_by | |
---|---|
cites | |
container_end_page | 5721 |
container_issue | |
container_start_page | 5716 |
container_title | Proceedings of the 44th IEEE Conference on Decision and Control |
container_volume | |
creator | Espinoza, M. ; Suykens, J.A.K. ; De Moor, B. |
description | In this paper we show how to use relevant prior information by imposing symmetry conditions (odd or even) on the Least Squares Support Vector Machines regression formulation. This is done by adding a simple constraint to the LS-SVM model, which ultimately translates into a new kernel. This equivalent kernel embodies the prior information about symmetry, so the dimension of the final dual system is the same as in the unrestricted case. We show that using a regularization term and a soft constraint provides a general framework which contains the unrestricted LS-SVM and the symmetry-constrained LS-SVM as extreme cases. Imposing symmetry substantially improves the performance of the models, in terms of both generalization ability and reduced model complexity. Practical examples with NARX models and time series prediction show satisfactory results. |
doi_str_mv | 10.1109/CDC.2005.1583074 |
format | conference_proceeding |
fulltext | fulltext_linktorsrc |
identifier | ISSN: 0191-2216 ; ISBN: 9780780395671 ; ISBN: 0780395670 |
ispartof | Proceedings of the 44th IEEE Conference on Decision and Control, 2005, p.5716-5721 |
issn | 0191-2216 |
language | eng |
recordid | cdi_ieee_primary_1583074 |
source | IEEE Electronic Library (IEL) Conference Proceedings |
subjects | Cost function ; Kernel ; Least squares approximation ; Least squares methods ; Linear systems ; Nonlinear systems ; Power system modeling ; Predictive models ; Quadratic programming ; Support vector machines |
title | Imposing Symmetry in Least Squares Support Vector Machines Regression |
url | http://sfxeu10.hosted.exlibrisgroup.com/loughborough?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-01-07T19%3A28%3A17IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-ieee_6IE&rft_val_fmt=info:ofi/fmt:kev:mtx:book&rft.genre=proceeding&rft.atitle=Imposing%20Symmetry%20in%20Least%20Squares%20Support%20Vector%20Machines%20Regression&rft.btitle=Proceedings%20of%20the%2044th%20IEEE%20Conference%20on%20Decision%20and%20Control&rft.au=Espinoza,%20M.&rft.date=2005&rft.spage=5716&rft.epage=5721&rft.pages=5716-5721&rft.issn=0191-2216&rft.isbn=9780780395671&rft.isbn_list=0780395670&rft_id=info:doi/10.1109/CDC.2005.1583074&rft_dat=%3Cieee_6IE%3E1583074%3C/ieee_6IE%3E%3Cgrp_id%3Ecdi_FETCH-LOGICAL-i175t-1844caf8b637f67028e7b784b7dfd24356b5133b981dfcbf6b5fd8ebebed89903%3C/grp_id%3E%3Coa%3E%3C/oa%3E%3Curl%3E%3C/url%3E&rft_id=info:oai/&rft_id=info:pmid/&rft_ieee_id=1583074&rfr_iscdi=true |
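The abstract states that the symmetry constraint "translates into a new kernel" whose dual system keeps the same dimension as the unrestricted case. A known closed form for such a symmetry-adapted kernel is K_sym(x, z) = (K(x, z) ± K(x, −z))/2, with + for even and − for odd target functions. The sketch below is illustrative only, not the paper's exact formulation (the paper's LS-SVM dual also carries a bias term and a soft-constraint trade-off, both omitted here); all function and parameter names (`rbf_kernel`, `lssvm_fit`, `reg`, `parity`) are hypothetical.

```python
import numpy as np

def rbf_kernel(X, Z, gamma=1.0):
    """Gaussian RBF kernel matrix between the rows of X and Z."""
    sq = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * sq)

def symmetric_kernel(X, Z, parity=1, gamma=1.0):
    """Symmetry-adapted kernel: average K(x, z) with K(x, -z).

    parity=+1 enforces even models, f(-x) =  f(x);
    parity=-1 enforces odd models,  f(-x) = -f(x).
    """
    return 0.5 * (rbf_kernel(X, Z, gamma) + parity * rbf_kernel(X, -Z, gamma))

def lssvm_fit(X, y, parity=1, gamma=1.0, reg=100.0):
    """Ridge-style LS-SVM dual solve (bias omitted): (K_sym + I/reg) alpha = y."""
    K = symmetric_kernel(X, X, parity=parity, gamma=gamma)
    return np.linalg.solve(K + np.eye(len(X)) / reg, y)

def lssvm_predict(X_train, alpha, X_test, parity=1, gamma=1.0):
    """Evaluate the dual model at new points via the same symmetric kernel."""
    return symmetric_kernel(X_test, X_train, parity=parity, gamma=gamma) @ alpha
```

Because the kernel itself is even (or odd) in its first argument, every function in the span of K_sym satisfies the symmetry exactly, yet the linear system solved in `lssvm_fit` is the same size as in the unconstrained case — which is the paper's central point.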