
Generalized Convolution Spectral Mixture for Multitask Gaussian Processes

Multitask Gaussian processes (MTGPs) are a powerful approach for modeling dependencies between multiple related tasks or functions for joint regression. Current kernels for MTGPs cannot fully model nonlinear task correlations and other types of dependencies. In this article, we address this limitation. We focus on spectral mixture (SM) kernels and propose an enhancement of this type of kernels, called the multitask generalized convolution SM (MT-GCSM) kernel. The MT-GCSM kernel can model nonlinear task correlations and dependence between components, including time and phase delay dependence. Each task in MT-GCSM has its own GCSM kernel with its own number of convolution structures, and dependencies between all components from different tasks are considered. Another constraint of current kernels for MTGPs is that components from different tasks are aligned. Here, we lift this constraint by using inner and outer full cross convolution between a base component and the reversed complex conjugate of another base component. Extensive experiments on two synthetic and three real-life data sets illustrate the difference between MT-GCSM and previous SM kernels as well as the practical effectiveness of MT-GCSM.
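
For orientation, the short Python sketch below shows a standard one-dimensional spectral mixture (SM) kernel of the kind that MT-GCSM generalizes: a weighted sum of components that are Gaussian in the spectral domain and therefore cosine-modulated squared exponentials in the input domain. The function name sm_kernel and the example parameter values are illustrative only; this is not the article's MT-GCSM kernel, which additionally models per-task convolution structures and cross-convolution between components of different tasks.

    import numpy as np

    def sm_kernel(x1, x2, weights, means, variances):
        # One-dimensional spectral mixture kernel:
        # k(tau) = sum_q w_q * exp(-2 * pi^2 * tau^2 * v_q) * cos(2 * pi * tau * mu_q),
        # where tau = x1 - x2, mu_q is a spectral mean and v_q a spectral variance.
        tau = x1[:, None] - x2[None, :]  # pairwise differences, shape (len(x1), len(x2))
        k = np.zeros_like(tau, dtype=float)
        for w, mu, v in zip(weights, means, variances):
            k += w * np.exp(-2.0 * np.pi**2 * tau**2 * v) * np.cos(2.0 * np.pi * tau * mu)
        return k

    # Illustrative use with two spectral components (hypothetical values).
    x = np.linspace(0.0, 1.0, 5)
    K = sm_kernel(x, x, weights=[1.0, 0.5], means=[2.0, 7.0], variances=[0.1, 0.3])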

Bibliographic Details
Published in: IEEE Transactions on Neural Networks and Learning Systems, 2020-12, Vol. 31 (12), p. 5613-5623
Main Authors: Chen, Kai; van Laarhoven, Twan; Groot, Perry; Chen, Jinsong; Marchiori, Elena
Format: Article
Language: English
ISSN: 2162-237X
EISSN: 2162-2388
DOI: 10.1109/TNNLS.2020.2980779
PMID: 32305940
Subjects: Convolution; Correlation; Cross convolution; Dependence; Frequency-domain analysis; Gaussian processes (GPs); Kernels; Multitask learning; Spectral mixture (SM); Task analysis; Task dependencies