
Nearly \(d\)-Linear Convergence Bounds for Diffusion Models via Stochastic Localization

Denoising diffusions are a powerful method to generate approximate samples from high-dimensional data distributions. Recent results provide polynomial bounds on their convergence rate, assuming \(L^2\)-accurate scores. Until now, the tightest bounds were either superlinear in the data dimension or required strong smoothness assumptions. We provide the first convergence bounds which are linear in the data dimension (up to logarithmic factors) assuming only finite second moments of the data distribution. We show that diffusion models require at most \(\tilde O(\frac{d \log^2(1/\delta)}{\varepsilon^2})\) steps to approximate an arbitrary distribution on \(\mathbb{R}^d\) corrupted with Gaussian noise of variance \(\delta\) to within \(\varepsilon^2\) in KL divergence. Our proof extends the Girsanov-based methods of previous works. We introduce a refined treatment of the error from discretizing the reverse SDE inspired by stochastic localization.
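
To make the headline rate concrete, the small sketch below evaluates the dominant term \(d \log^2(1/\delta) / \varepsilon^2\) of the step-count bound for a few illustrative parameter settings. Since the \(\tilde O\) hides an unspecified constant and logarithmic factors, the numbers convey scaling in \(d\), \(\delta\), and \(\varepsilon\) only, not actual step counts.

```python
import math

def step_bound(d: int, delta: float, eps: float) -> float:
    """Dominant term d * log^2(1/delta) / eps^2 of the step-count bound.
    The tilde-O constant is unspecified, so this shows scaling only."""
    return d * math.log(1.0 / delta) ** 2 / eps**2

# Linear in d, quadratic in log(1/delta), quadratic in 1/eps:
for d, delta, eps in [(100, 1e-3, 0.10), (200, 1e-3, 0.10),
                      (100, 1e-6, 0.10), (100, 1e-3, 0.05)]:
    print(f"d={d:4d}  delta={delta:.0e}  eps={eps:.2f}  "
          f"bound ~ {step_bound(d, delta, eps):,.0f}")
```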

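The bound concerns samplers of the following generic shape: noise the data with an Ornstein–Uhlenbeck forward process, then discretize the reverse-time SDE using the learned score and stop early at forward time \(\delta\). The sketch below is a standard Euler–Maruyama version of that recipe with geometrically shrinking step sizes toward the data end (the source of the \(\log(1/\delta)\) factors); it is for intuition only and is not the paper's exact discretization. The Gaussian demo target, chosen here because its score is available in closed form, stands in for a learned \(L^2\)-accurate score estimate.

```python
import numpy as np

def reverse_sde_sampler(score, d, T=10.0, delta=1e-3, n_steps=500, rng=None):
    """Euler-Maruyama discretization of the reverse-time SDE
        dY = (Y + 2 * score(t, Y)) ds + sqrt(2) dB   (t = forward time)
    for the OU forward process dX = -X dt + sqrt(2) dB, initialized at
    N(0, I) and stopped early at forward time delta.  np.geomspace gives
    step sizes that shrink geometrically as t -> delta, concentrating
    effort where the score varies most."""
    rng = np.random.default_rng() if rng is None else rng
    times = np.geomspace(T, delta, n_steps + 1)  # forward times, T -> delta
    y = rng.standard_normal(d)                   # approximates p_T ~ N(0, I)
    for t, t_next in zip(times[:-1], times[1:]):
        h = t - t_next                           # positive, shrinking step
        y = y + h * (y + 2.0 * score(t, y)) \
              + np.sqrt(2.0 * h) * rng.standard_normal(d)
    return y  # approx sample of the data law mollified at forward time delta

# Demo with data ~ N(mu, I): under the OU process p_t = N(exp(-t) * mu, I),
# so the exact score is -(x - exp(-t) * mu).
mu = 3.0 * np.ones(4)

def exact_score(t, x):
    return -(x - np.exp(-t) * mu)

samples = np.stack([reverse_sde_sampler(exact_score, d=4) for _ in range(200)])
print(samples.mean(axis=0))  # approximately mu
```
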
Bibliographic Details
Published in: arXiv.org, 2024-03
Main Authors: Benton, Joe; De Bortoli, Valentin; Doucet, Arnaud; Deligiannidis, George
Format: Article
Language: English
EISSN: 2331-8422
Publisher: Cornell University Library, arXiv.org (Ithaca)
Subjects: Convergence; Divergence; Localization; Polynomials; Random noise; Smoothness