
Some greedy algorithms for sparse polynomial chaos expansions

Compressed sensing algorithms approximate functions using limited evaluations by searching for a sparse representation among a dictionary of basis functions. Orthogonal matching pursuit (OMP) is a greedy algorithm for selecting basis functions whose computational cost scales with the size of the dictionary.


Bibliographic Details
Published in: Journal of computational physics 2019-06, Vol.387, p.303-325
Main Authors: Baptista, Ricardo, Stolbunov, Valentin, Nair, Prasanth B.
Format: Article
Language:English
description Compressed sensing algorithms approximate functions using limited evaluations by searching for a sparse representation among a dictionary of basis functions. Orthogonal matching pursuit (OMP) is a greedy algorithm for selecting basis functions whose computational cost scales with the size of the dictionary. For polynomial chaos (PC) approximations, the size of the dictionary grows quickly with the number of model inputs and the maximum polynomial degree, making them often prohibitive to use with greedy methods. We propose two new algorithms for efficiently constructing sparse PC expansions for problems with high-dimensional inputs. The first algorithm is a parallel OMP method coupled with an incremental QR factorization scheme, wherein the model construction step is interleaved with a ν-fold cross-validation procedure. The second approach is a randomized greedy algorithm that leverages a probabilistic argument to only evaluate a subset of basis functions from the dictionary at each iteration of the incremental algorithm. The randomized algorithm is demonstrated to recover model outputs with a similar level of sparsity and accuracy as OMP, but with a cost that is independent of the dictionary size. Both algorithms are validated with a numerical comparison of their performance on a series of algebraic test problems and PDEs with high-dimensional inputs. •We consider the problem of efficiently learning sparse polynomial chaos expansions.•A novel parallelization strategy is proposed.•A randomized greedy algorithm is developed to handle large-scale problems.•A detailed theoretical analysis is carried out to evaluate the proposed algorithms.
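The abstract above describes two things that a short sketch can make concrete: the combinatorial growth of a total-degree polynomial chaos dictionary, and the greedy column-selection loop of OMP whose per-iteration cost scales with dictionary size. The following is a minimal illustrative sketch, not the paper's parallel OMP with incremental QR or its randomized variant; the comment marks the correlation step that the randomized algorithm would restrict to a random subset of columns.

```python
import numpy as np
from math import comb

# A total-degree PC dictionary with d inputs and maximum degree p has
# C(d + p, p) basis functions, so it grows quickly in d and p, e.g.:
assert comb(20 + 3, 3) == 1771  # d = 20 inputs, degree p = 3

def omp(A, y, k):
    """Plain orthogonal matching pursuit: greedily pick k columns of the
    dictionary A that best explain y, re-solving a least-squares problem
    on the current support at each step. A minimal sketch only."""
    n, m = A.shape
    residual = y.copy()
    support = []
    x = np.zeros(m)
    for _ in range(k):
        # Correlate the residual with every column: the O(m) step whose
        # cost the paper's randomized greedy variant reduces by scoring
        # only a random subset of the dictionary per iteration.
        scores = np.abs(A.T @ residual)
        scores[support] = -np.inf          # never reselect a column
        support.append(int(np.argmax(scores)))
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        x[:] = 0.0
        x[support] = coef
        residual = y - A @ x
    return x, support
```

On a well-conditioned random dictionary, this loop typically recovers an exactly sparse signal in k iterations; the per-iteration correlation over all m columns is what makes the cost dictionary-size dependent, as the abstract notes.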
doi_str_mv 10.1016/j.jcp.2019.01.035
publisher Cambridge: Elsevier Inc
publishDate 2019-06-15
rights 2019 Elsevier Inc.
identifier ISSN: 0021-9991
EISSN: 1090-2716
source ScienceDirect Journals
subjects Algorithms
Basis functions
Compressed sensing
Computational physics
Dictionaries
Greedy algorithms
Iterative methods
Mathematical analysis
Personal computers
Polynomial chaos
Polynomials
Randomization
Set theory
Stochastic models
Uncertainty quantification