
Heterogeneous Representation Learning with Structured Sparsity Regularization

Motivated by real applications, heterogeneous learning has emerged as an important research area, which aims to model the co-existence of multiple types of heterogeneity. In this paper, we propose a HEterogeneous REpresentation learning model with structured Sparsity regularization (HERES) to learn from multiple types of heterogeneity. HERES aims to leverage two kinds of information to build a robust learning system. One is the rich correlations among heterogeneous data such as task relatedness, view consistency, and label correlation. The other is the prior knowledge of the data in the form of, e.g., the soft-clustering of the tasks. HERES is a generic framework for heterogeneous learning, which integrates multi-task, multi-view, and multi-label learning into a principled framework based on representation learning. The objective of HERES is to minimize the reconstruction loss of using the factor matrices to recover the input matrix for heterogeneous data, regularized by the structured sparsity constraint. The resulting optimization problem is challenging due to the non-smoothness and non-separability of structured sparsity. We develop an iterative updating method to solve the problem. Furthermore, we prove that the reformulation of structured sparsity is separable, which leads to a family of efficient and scalable algorithms for solving structured sparsity penalized problems. The experimental results in comparison with state-of-the-art methods demonstrate the effectiveness of the proposed approach.

Bibliographic Details
Main Authors: Pei Yang, Jingrui He
Format: Conference Proceeding
Language: English
Subjects: Correlation; Data models; Encoding; Heterogeneous learning; Learning systems; Matrix decomposition; multi-label learning; multi-task learning; multi-view learning; Optimization; Robustness
Online Access: Request full text
container_end_page 548
container_start_page 539
creator Pei Yang; Jingrui He
description Motivated by real applications, heterogeneous learning has emerged as an important research area, which aims to model the co-existence of multiple types of heterogeneity. In this paper, we propose a HEterogeneous REpresentation learning model with structured Sparsity regularization (HERES) to learn from multiple types of heterogeneity. HERES aims to leverage two kinds of information to build a robust learning system. One is the rich correlations among heterogeneous data such as task relatedness, view consistency, and label correlation. The other is the prior knowledge of the data in the form of, e.g., the soft-clustering of the tasks. HERES is a generic framework for heterogeneous learning, which integrates multi-task, multi-view, and multi-label learning into a principled framework based on representation learning. The objective of HERES is to minimize the reconstruction loss of using the factor matrices to recover the input matrix for heterogeneous data, regularized by the structured sparsity constraint. The resulting optimization problem is challenging due to the non-smoothness and non-separability of structured sparsity. We develop an iterative updating method to solve the problem. Furthermore, we prove that the reformulation of structured sparsity is separable, which leads to a family of efficient and scalable algorithms for solving structured sparsity penalized problems. The experimental results in comparison with state-of-the-art methods demonstrate the effectiveness of the proposed approach.
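The objective sketched in the description — a reconstruction loss over factor matrices plus a non-smooth structured sparsity penalty, solved by iterative updating — can be illustrated in a simplified form. The snippet below is not the paper's HERES algorithm; it is a minimal single-matrix sketch, assuming a rank-k factorization X ≈ UV with an ℓ2,1 penalty on the rows of V, handled by the standard iterative-reweighting trick that makes each subproblem a smooth least-squares solve. The function name and all parameter choices are illustrative.

```python
import numpy as np

def l21_factorization(X, rank=5, lam=0.1, n_iter=200, seed=0):
    """Alternately fit U and V to minimize
        ||X - U V||_F^2 + lam * sum_i ||V[i, :]||_2   (an l2,1 penalty).
    The non-smooth row penalty is replaced by tr(V^T D V) with
    D_ii = 1 / (2 ||V[i, :]|| + eps), so each V-step reduces to a
    ridge-style linear solve (the iterative-reweighting trick)."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    U = rng.standard_normal((n, rank))
    V = rng.standard_normal((rank, d))
    eps = 1e-8
    for _ in range(n_iter):
        # U-step: ordinary least squares with V held fixed.
        U = X @ V.T @ np.linalg.inv(V @ V.T + eps * np.eye(rank))
        # V-step: reweighted ridge; rows of V with small norm receive
        # large weights, driving them toward zero (row-structured sparsity).
        D = np.diag(1.0 / (2.0 * np.linalg.norm(V, axis=1) + eps))
        V = np.linalg.solve(U.T @ U + lam * D, U.T @ X)
    return U, V
```

The reweighting step is one common way to obtain the kind of separable, smooth reformulation of a structured sparsity penalty that the abstract alludes to: with D fixed, the V-subproblem decouples into independent least-squares systems and scales to large inputs.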
doi 10.1109/ICDM.2016.0065
format conference_proceeding
identifier EISSN: 2374-8486
ispartof 2016 IEEE 16th International Conference on Data Mining (ICDM), 2016, p.539-548
issn 2374-8486
language eng
source IEEE Xplore All Conference Series
subjects Correlation
Data models
Encoding
Heterogeneous learning
Learning systems
Matrix decomposition
multi-label learning
multi-task learning
multi-view learning
Optimization
Robustness
title Heterogeneous Representation Learning with Structured Sparsity Regularization