Linearly convergent away-step conditional gradient for non-strongly convex functions
We consider the problem of minimizing the sum of a linear function and a composition of a strongly convex function with a linear transformation over a compact polyhedral set. Jaggi and Lacoste-Julien (An affine invariant linear convergence analysis for Frank-Wolfe algorithms. NIPS 2013 Workshop on Greedy Algorithms, Frank-Wolfe and Friends, 2014) show that the conditional gradient method with away steps — employed on the aforementioned problem without the additional linear term — has a linear rate of convergence, depending on the so-called pyramidal width of the feasible set. We revisit this result and provide a variant of the algorithm and an analysis based on simple linear programming duality arguments, as well as corresponding error bounds. This new analysis (a) enables the incorporation of the additional linear term, and (b) depends on a new constant that is explicitly expressed in terms of the problem's parameters and the geometry of the feasible set. This constant replaces the pyramidal width, which is difficult to evaluate.
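As an illustration of the method the abstract describes (not code from the paper), the sketch below runs an away-step conditional gradient (Frank-Wolfe) scheme on a toy instance over the unit simplex, where the objective f(x) = ⟨b, x⟩ + ½‖Ex − y‖² is one example of a linear term plus a strongly convex function composed with a linear map. All names, data, and parameter choices (E, y, b, tolerances) are hypothetical, and the step size comes from exact line search for this quadratic rather than from the paper's analysis constants.

```python
# Illustrative sketch only: away-step conditional gradient (Frank-Wolfe)
# over the unit simplex, for f(x) = <b, x> + 0.5 * ||E @ x - y||^2.
# Problem data and tolerances are hypothetical placeholders.
import numpy as np


def away_step_cg(E, y, b, max_iter=1000, tol=1e-8):
    m, n = E.shape
    grad = lambda x: b + E.T @ (E @ x - y)        # gradient of the objective
    x = np.zeros(n)
    x[0] = 1.0                                    # start at a vertex of the simplex
    weights = {0: 1.0}                            # convex weights of active vertices

    for _ in range(max_iter):
        g = grad(x)
        s = int(np.argmin(g))                     # Frank-Wolfe vertex (LMO over the simplex)
        a = max(weights, key=lambda i: g[i])      # away vertex: worst active vertex
        fw_gap = g @ x - g[s]                     # standard Frank-Wolfe optimality gap
        if fw_gap <= tol:
            break
        if fw_gap >= g[a] - g @ x:                # pick the more promising direction
            direction = -x.copy(); direction[s] += 1.0
            gamma_max, step = 1.0, ("fw", s)
        else:
            direction = x.copy(); direction[a] -= 1.0
            gamma_max = weights[a] / max(1.0 - weights[a], 1e-16)
            step = ("away", a)
        # exact line search for the quadratic objective, clipped to [0, gamma_max]
        Ed = E @ direction
        denom = Ed @ Ed
        gamma = gamma_max if denom <= 1e-16 else min(gamma_max, -(g @ direction) / denom)
        x = x + gamma * direction
        kind, v = step                            # update the active-set weights
        if kind == "fw":
            weights = {i: (1.0 - gamma) * w for i, w in weights.items()}
            weights[s] = weights.get(s, 0.0) + gamma
        else:
            weights = {i: (1.0 + gamma) * w for i, w in weights.items()}
            weights[v] -= gamma
        weights = {i: w for i, w in weights.items() if w > 1e-12}
    return x


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    E, y, b = rng.standard_normal((5, 8)), rng.standard_normal(5), rng.standard_normal(8)
    x_opt = away_step_cg(E, y, b)
    print(x_opt, x_opt.sum())                     # iterate stays on the simplex
```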
Published in: | Mathematical programming, 2017-07, Vol. 164 (1-2), p. 1-27 |
---|---|
Main Authors: | Beck, Amir; Shtern, Shimrit |
Format: | Article |
Language: | English |
container_end_page | 27 |
container_issue | 1-2 |
container_start_page | 1 |
container_title | Mathematical programming |
container_volume | 164 |
creator | Beck, Amir; Shtern, Shimrit
description | We consider the problem of minimizing the sum of a linear function and a composition of a strongly convex function with a linear transformation over a compact polyhedral set. Jaggi and Lacoste-Julien (An affine invariant linear convergence analysis for Frank-Wolfe algorithms. NIPS 2013 Workshop on Greedy Algorithms, Frank-Wolfe and Friends, 2014) show that the conditional gradient method with away steps — employed on the aforementioned problem without the additional linear term — has a linear rate of convergence, depending on the so-called pyramidal width of the feasible set. We revisit this result and provide a variant of the algorithm and an analysis based on simple linear programming duality arguments, as well as corresponding error bounds. This new analysis (a) enables the incorporation of the additional linear term, and (b) depends on a new constant that is explicitly expressed in terms of the problem's parameters and the geometry of the feasible set. This constant replaces the pyramidal width, which is difficult to evaluate.
doi_str_mv | 10.1007/s10107-016-1069-4 |
format | article |
publisher | Berlin/Heidelberg: Springer Berlin Heidelberg
identifier | ISSN: 0025-5610 |
ispartof | Mathematical programming, 2017-07, Vol.164 (1-2), p.1-27 |
issn | 0025-5610 1436-4646 |
language | eng |
source | Business Source Ultimate; ABI/INFORM Global; Springer Nature |
subjects | Algorithms; Calculus of Variations and Optimal Control; Optimization; Combinatorics; Convergence; Error analysis; Feasibility; Full Length Paper; Greedy algorithms; Linear programming; Mathematical and Computational Physics; Mathematical Methods in Physics; Mathematics; Mathematics and Statistics; Mathematics of Computing; Numerical Analysis; S parameters; Theoretical
title | Linearly convergent away-step conditional gradient for non-strongly convex functions |