Scalable balanced training of conditional generative adversarial neural networks on image data

Bibliographic Details
Published in: The Journal of Supercomputing, 2021-11, Vol. 77 (11), p. 13358-13384
Main Authors: Lupo Pasini, Massimiliano, Gabbi, Vittorio, Yin, Junqi, Perotto, Simona, Laanait, Nouamane
Format: Article
Language: English
Description
Summary: We propose a distributed approach to train deep convolutional generative adversarial neural network (DC-CGANs) models. Our method reduces the imbalance between generator and discriminator by partitioning the training data according to data labels, and it enhances scalability by performing a parallel training in which multiple generators are trained concurrently, each focusing on a single data label. Performance is assessed in terms of inception score, Fréchet inception distance, and image quality on the MNIST, CIFAR10, CIFAR100, and ImageNet1k datasets, showing a significant improvement over state-of-the-art techniques for training DC-CGANs. Weak scaling is attained on all four datasets using up to 1000 processes and 2000 NVIDIA V100 GPUs on the OLCF supercomputer Summit.
ISSN: 0920-8542
1573-0484
DOI: 10.1007/s11227-021-03808-2
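Illustrative sketch (not from the article): the summary describes partitioning the training data by class label and training one generator per label concurrently. The minimal Python/PyTorch example below shows that scheme on MNIST under simplifying assumptions; the helper names (partition_by_label, make_generator, make_discriminator, train_pair) and the toy fully connected networks are hypothetical stand-ins for the paper's DCGAN-style convolutional models, and the serial loop at the end stands in for the concurrent per-label jobs (e.g., one MPI rank per label) used in the distributed setting.

# Sketch of label-partitioned GAN training, assuming a labeled dataset
# such as MNIST. Not the authors' implementation.
import torch
from torch import nn
from torch.utils.data import DataLoader, Subset
from torchvision import datasets, transforms

def partition_by_label(dataset):
    """Group sample indices by class label, mirroring the paper's data split."""
    buckets = {}
    for idx, (_, label) in enumerate(dataset):
        buckets.setdefault(label, []).append(idx)
    return {label: Subset(dataset, idxs) for label, idxs in buckets.items()}

def make_generator(z_dim=100):
    # Toy MLP generator for 28x28 images; the paper uses convolutional
    # DCGAN-style networks instead.
    return nn.Sequential(nn.Linear(z_dim, 256), nn.ReLU(),
                         nn.Linear(256, 28 * 28), nn.Tanh())

def make_discriminator():
    return nn.Sequential(nn.Linear(28 * 28, 256), nn.LeakyReLU(0.2),
                         nn.Linear(256, 1))

def train_pair(subset, epochs=1, z_dim=100, device="cpu"):
    """Standard GAN updates on a single-label subset; one such job would
    run per process/GPU in the distributed setting."""
    G = make_generator(z_dim).to(device)
    D = make_discriminator().to(device)
    opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
    opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)
    loss_fn = nn.BCEWithLogitsLoss()
    loader = DataLoader(subset, batch_size=64, shuffle=True)
    for _ in range(epochs):
        for real, _ in loader:
            real = real.view(real.size(0), -1).to(device)
            z = torch.randn(real.size(0), z_dim, device=device)
            fake = G(z)
            ones = torch.ones(real.size(0), 1, device=device)
            zeros = torch.zeros(real.size(0), 1, device=device)
            # Discriminator step: real samples vs. detached fakes.
            opt_d.zero_grad()
            d_loss = loss_fn(D(real), ones) + loss_fn(D(fake.detach()), zeros)
            d_loss.backward()
            opt_d.step()
            # Generator step: try to fool the updated discriminator.
            opt_g.zero_grad()
            g_loss = loss_fn(D(fake), ones)
            g_loss.backward()
            opt_g.step()
    return G

if __name__ == "__main__":
    mnist = datasets.MNIST(".", train=True, download=True,
                           transform=transforms.Compose(
                               [transforms.ToTensor(),
                                transforms.Normalize((0.5,), (0.5,))]))
    per_label = partition_by_label(mnist)
    # In the paper each per-label pair trains concurrently on its own
    # process/GPU; a serial loop is shown here for simplicity.
    generators = {lbl: train_pair(sub) for lbl, sub in per_label.items()}

Because each generator only ever sees samples of one class, no single generator has to cover the full data distribution, which is the source of the generator/discriminator imbalance the summary says the method reduces; it also makes the per-label jobs independent, which is what allows the weak scaling reported on Summit.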