
Complex Support Vector Machines for Regression and Quaternary Classification

Bibliographic Details
Published in: IEEE Transactions on Neural Networks and Learning Systems, 2015-06, Vol. 26 (6), p. 1260-1274
Main Authors: Bouboulis, Pantelis; Theodoridis, Sergios; Mavroforakis, Charalampos; Evaggelatou-Dalla, Leoni
Format: Article
Language:English
Subjects: Calculus; Classification; complex kernels; complex valued data; Estimation; Hilbert space; Kernel; regression; Support vector machines; Vectors; widely linear estimation
Description: The paper presents a new framework for complex support vector regression (SVR) as well as Support Vector Machines (SVM) for quaternary classification. The method exploits the notion of widely linear estimation to model the input-output relation for complex-valued data and considers two cases: 1) the complex data are split into their real and imaginary parts and a typical real kernel is employed to map the complex data to a complexified feature space and 2) a pure complex kernel is used to directly map the data to the induced complex feature space. The recently developed Wirtinger's calculus on complex reproducing kernel Hilbert spaces is employed to compute the Lagrangian and derive the dual optimization problem. As one of our major results, we prove that any complex SVM/SVR task is equivalent to solving two real SVM/SVR tasks exploiting a specific real kernel, which is generated by the chosen complex kernel. In particular, the case of pure complex kernels leads to the generation of new kernels, which have not been considered before. In the classification case, the proposed framework inherently splits the complex space into four parts. This leads naturally to solving a four-class task (quaternary classification), instead of the typical two classes of the real SVM. In turn, this rationale can be used in a multiclass problem as a split-class scenario based on four classes, as opposed to the one-versus-all method; this can lead to significant computational savings. Experiments demonstrate the effectiveness of the proposed framework for regression and classification tasks that involve complex data.
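The description above carries two concrete ideas that a small illustration can make tangible: a complex-valued regression task can be handled through two real SVR tasks on the split real/imaginary representation, and the complex output space naturally induces a four-class (quaternary) problem. The sketch below is a minimal illustration of that dual-channel viewpoint using ordinary real-kernel machines from scikit-learn; it is not the paper's actual construction (no complex or induced kernels, no Wirtinger calculus), and the synthetic data, hyperparameters, and variable names are purely illustrative.

```python
# Minimal sketch of the dual-channel (widely linear) view: a complex-output
# task handled by two real machines on the stacked real/imaginary features.
# NOT the paper's derivation; data and settings are illustrative only.
import numpy as np
from sklearn.svm import SVR, SVC

rng = np.random.default_rng(0)

# Synthetic complex-valued data: y = w * x + noise, with a complex weight w.
n = 200
x = rng.standard_normal(n) + 1j * rng.standard_normal(n)
w = 0.8 - 0.3j
y = w * x + 0.05 * (rng.standard_normal(n) + 1j * rng.standard_normal(n))

# Case 1 representation: split each complex input into real and imaginary
# parts and treat them as a 2-D real feature vector.
X_real = np.column_stack([x.real, x.imag])

# Regression: one real SVR per output channel, each with an ordinary real kernel.
svr_re = SVR(kernel="rbf", C=10.0, epsilon=0.01).fit(X_real, y.real)
svr_im = SVR(kernel="rbf", C=10.0, epsilon=0.01).fit(X_real, y.imag)

# Recombine the two real predictions into a complex estimate.
y_hat = svr_re.predict(X_real) + 1j * svr_im.predict(X_real)
print("mean |error|:", np.mean(np.abs(y_hat - y)))

# Quaternary classification: the quadrant of a complex "desired" output
# defines four classes, obtained from two binary SVMs (sign of real part,
# sign of imaginary part).
d = x * np.exp(1j * 0.5)  # illustrative complex labels
cls_re = SVC(kernel="rbf").fit(X_real, d.real > 0)
cls_im = SVC(kernel="rbf").fit(X_real, d.imag > 0)

# Combine the two binary decisions into one of four quadrant labels (0..3).
labels = 2 * cls_re.predict(X_real).astype(int) + cls_im.predict(X_real).astype(int)
print("class counts:", np.bincount(labels, minlength=4))
```

The paper's stronger result is that, for a suitably chosen real kernel generated by the complex one, such a two-real-machine solution is exactly equivalent to the complex SVM/SVR problem; the sketch only mirrors the splitting idea, not that equivalence.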
DOI: 10.1109/TNNLS.2014.2336679
ISSN: 2162-237X
EISSN: 2162-2388
PMID: 25095266
Publisher: IEEE
Online Access: IEEE Xplore (Online service)