Blockwise coordinate descent schemes for sparse representation
The prevailing sparse representation framework decouples the problem into two subproblems, alternating between sparse coding and dictionary learning with different optimizers, and thereby treats the elements of the bases and of the codes separately. In this paper, we treat the elements of both bases and codes homogeneously. The original optimization is decoupled directly into several blockwise alternating subproblems rather than the two above, so the sparse coding and basis learning optimizations are coupled together. The variables involved are partitioned into suitable blocks with convexity preserved, which makes an exact block coordinate descent possible. For each separable subproblem, a closed-form solution is obtained from the convexity and monotonicity of the parabolic function, so the algorithm is simple, efficient, and effective. Experimental results show that it significantly accelerates the learning process.
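The abstract states the closed-form solution only at a high level. As a hedged reconstruction, assuming the standard l1-regularized objective min over (B, S) of ||X - BS||_F^2 + lambda ||S||_1 (the paper's exact formulation may differ), the single-entry subproblem and its parabolic closed form look like this:

```latex
% Assumption: standard l1-regularized objective, not necessarily the
% paper's exact formulation. Fixing every variable except one code
% entry s = S_{kn} leaves a parabola plus an absolute-value term:
\min_{s}\; a s^{2} - 2 b s + \lambda \lvert s\rvert + \mathrm{const},
\qquad
a = \lVert B_{\cdot k}\rVert_2^2, \quad
b = B_{\cdot k}^{\top}\Bigl(X_{\cdot n} - \sum_{j \neq k} B_{\cdot j}\, S_{jn}\Bigr).
% By convexity and the monotonicity of the parabola on each side of its
% vertex, the minimizer is the soft-thresholded vertex:
s^{\star} \;=\; \frac{\operatorname{sign}(b)\,\max\bigl(\lvert b\rvert - \lambda/2,\; 0\bigr)}{a}.
```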
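Building on that update, the sketch below is a minimal, illustrative blockwise coordinate descent loop in which each code row and its basis atom form one coordinate block, so coding and basis learning are interleaved rather than alternated wholesale. It is a sketch under the stated assumptions, not the authors' published algorithm; the names (`bcd_sparse_representation`, `lam`, `n_atoms`) are hypothetical.

```python
import numpy as np

def bcd_sparse_representation(X, n_atoms, lam=0.1, n_iters=50, seed=0):
    """Illustrative sketch: min ||X - B S||_F^2 + lam * ||S||_1, ||b_k||_2 = 1."""
    rng = np.random.default_rng(seed)
    d, n = X.shape
    B = rng.standard_normal((d, n_atoms))
    B /= np.linalg.norm(B, axis=0, keepdims=True)   # unit-norm atoms
    S = np.zeros((n_atoms, n))
    R = X - B @ S                                   # residual, maintained incrementally
    for _ in range(n_iters):
        for k in range(n_atoms):
            # Fold atom k's contribution back in so R excludes block k.
            R += np.outer(B[:, k], S[k, :])
            # Code-row block: each entry is a parabolic subproblem,
            # solved by the soft-thresholded vertex derived above.
            b = B[:, k] @ R
            a = B[:, k] @ B[:, k]                   # equals 1 for unit-norm atoms
            S[k, :] = np.sign(b) * np.maximum(np.abs(b) - lam / 2.0, 0.0) / a
            # Basis-atom block: least-squares direction, renormalized
            # to keep the unit-norm constraint.
            g = R @ S[k, :]
            nrm = np.linalg.norm(g)
            if nrm > 0:
                B[:, k] = g / nrm
            # Subtract the refreshed contribution.
            R -= np.outer(B[:, k], S[k, :])
    return B, S
```

Maintaining the residual `R` incrementally is what keeps each block update cheap: adding atom k's contribution back in, refreshing the pair in closed form, and subtracting it out again avoids recomputing `X - B @ S` from scratch on every pass.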
Main Authors: | Bao-Di Liu, Yu-Xiong Wang, Bin Shen, Yu-Jin Zhang, Yan-Jiang Wang |
---|---|
Format: | Conference Proceeding |
Language: | English |
Subjects: | Convergence, coordinate descent, Dictionaries, Dictionary learning, Encoding, Linear programming, Minimization, Optimization, sparse coding, Sparse matrices |
creator | Bao-Di Liu; Yu-Xiong Wang; Bin Shen; Yu-Jin Zhang; Yan-Jiang Wang
---|---|
doi | 10.1109/ICASSP.2014.6854608
format | conference_proceeding |
identifier | ISSN: 1520-6149 |
ispartof | 2014 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2014, pp. 5267-5271
issn | 1520-6149 (ISSN); 2379-190X (EISSN)
language | eng |
source | IEEE Xplore All Conference Series |
subjects | Convergence; coordinate descent; Dictionaries; Dictionary learning; Encoding; Linear programming; Minimization; Optimization; sparse coding; Sparse matrices
title | Blockwise coordinate descent schemes for sparse representation |