
DiG: Scalable and Efficient Diffusion Models with Gated Linear Attention

Bibliographic Details
Published in: arXiv.org, 2024-11
Main Authors: Zhu, Lianghui; Huang, Zilong; Liao, Bencheng; Liew, Jun Hao; Yan, Hanshu; Feng, Jiashi; Wang, Xinggang
Format: Article
Language: English
Description
Summary: Diffusion models with large-scale pre-training have achieved significant success in visual content generation, particularly exemplified by Diffusion Transformers (DiT). However, DiT models face efficiency challenges from the quadratic complexity of self-attention, especially when handling long sequences. In this paper, we aim to incorporate the sub-quadratic modeling capability of Gated Linear Attention (GLA) into the 2D diffusion backbone. Specifically, we introduce Diffusion Gated Linear Attention Transformers (DiG), a simple, adoptable solution with minimal parameter overhead. We offer two variants, i.e., a plain and a U-shape architecture, showing superior efficiency and competitive effectiveness. In addition to outperforming DiT and other sub-quadratic-time diffusion models at \(256 \times 256\) resolution, DiG demonstrates greater efficiency than these methods starting from a \(512\) resolution. Specifically, DiG-S/2 is \(2.5\times\) faster and saves \(75.7\%\) GPU memory compared to DiT-S/2 at a \(1792\) resolution. Additionally, DiG-XL/2 is \(4.2\times\) faster than the Mamba-based model at a \(1024\) resolution and \(1.8\times\) faster than DiT with FlashAttention-2 at a \(2048\) resolution. Code is released at https://github.com/hustvl/DiG.
ISSN: 2331-8422
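
For readers unfamiliar with Gated Linear Attention, the sketch below illustrates the per-token recurrence that gives GLA its sub-quadratic (linear-in-sequence-length) cost, which is the property the abstract refers to. It is a minimal NumPy sketch of the standard GLA formulation, not the DiG implementation; the function name, tensor shapes, and sigmoid gates are illustrative assumptions.

# Minimal sketch of a gated linear attention (GLA) recurrence; illustrative only,
# not the DiG code. Assumes per-token, per-key-dimension gates alpha in (0, 1).
import numpy as np

def gated_linear_attention(q, k, v, alpha):
    """q, k: (T, d_k); v: (T, d_v); alpha: (T, d_k) data-dependent gates.

    Runs in O(T * d_k * d_v) time and O(d_k * d_v) state memory per head,
    i.e. linear in sequence length T, versus quadratic for softmax attention.
    """
    T, d_k = q.shape
    d_v = v.shape[1]
    S = np.zeros((d_k, d_v))          # matrix-valued recurrent state
    out = np.zeros((T, d_v))
    for t in range(T):
        # Gate decays the previous state per key dimension, then adds k_t v_t^T.
        S = alpha[t][:, None] * S + np.outer(k[t], v[t])
        out[t] = q[t] @ S             # read out with the current query
    return out

# Example: a short sequence of image-patch tokens (shapes are arbitrary).
rng = np.random.default_rng(0)
T, d_k, d_v = 16, 8, 8
q, k, v = (rng.standard_normal((T, d)) for d in (d_k, d_k, d_v))
alpha = 1.0 / (1.0 + np.exp(-rng.standard_normal((T, d_k))))  # sigmoid gates
y = gated_linear_attention(q, k, v, alpha)
print(y.shape)  # (16, 8)

Because the state S is a fixed-size matrix updated token by token, memory does not grow with sequence length, which is consistent with the GPU-memory savings reported in the abstract at high resolutions.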