
Nearly \(d\)-Linear Convergence Bounds for Diffusion Models via Stochastic Localization

Bibliographic Details
Published in: arXiv.org, 2024-03
Main Authors: Benton, Joe, De Bortoli, Valentin, Doucet, Arnaud, Deligiannidis, George
Format: Article
Language: English
Description
Summary: Denoising diffusions are a powerful method for generating approximate samples from high-dimensional data distributions. Recent results provide polynomial bounds on their convergence rate, assuming \(L^2\)-accurate scores. Until now, the tightest bounds were either superlinear in the data dimension or required strong smoothness assumptions. We provide the first convergence bounds which are linear in the data dimension (up to logarithmic factors), assuming only finite second moments of the data distribution. We show that diffusion models require at most \(\tilde O(\frac{d \log^2(1/\delta)}{\varepsilon^2})\) steps to approximate an arbitrary distribution on \(\mathbb{R}^d\), corrupted with Gaussian noise of variance \(\delta\), to within \(\varepsilon^2\) in KL divergence. Our proof extends the Girsanov-based methods of previous works, introducing a refined treatment of the error from discretizing the reverse SDE, inspired by stochastic localization.
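To make the scaling of the stated bound concrete, the following minimal Python sketch evaluates the step count \(\tilde O(d \log^2(1/\delta) / \varepsilon^2)\) up to an unspecified constant. The constant `C` and the function name `step_bound` are illustrative placeholders, not part of the paper; the \(\tilde O\) notation also hides logarithmic factors that this sketch ignores.

```python
import math

def step_bound(d, delta, eps, C=1.0):
    """Illustrative step count ~ C * d * log^2(1/delta) / eps^2.

    d:     data dimension
    delta: variance of the Gaussian noise corrupting the target distribution
    eps:   accuracy target (error eps^2 in KL divergence)
    C:     placeholder for the constant hidden by the O-tilde notation
    """
    return math.ceil(C * d * math.log(1 / delta) ** 2 / eps ** 2)
```

Under this sketch, doubling the dimension \(d\) roughly doubles the required number of steps, while halving \(\varepsilon\) quadruples it — the linear-in-\(d\) dependence that distinguishes this bound from earlier superlinear ones.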
ISSN:2331-8422