
Adversarial sampling of unknown and high-dimensional conditional distributions

Bibliographic Details
Published in: Journal of Computational Physics, February 2022, Vol. 450 (C), Article 110853
Main Authors: Hassanaly, Malik; Glaws, Andrew; Stengel, Karen; King, Ryan N.
Format: Article
Language:English
Description
[Figure] Illustration of the outcome of the algorithm: one can generate many high-dimensional samples (ξ_SR) that are consistent with given low-resolution conditional continuous variables (ξ_LR).

Highlights:
• Conditional GANs can be regularized with estimates of conditional moments.
• Moments can be estimated with external neural nets or stochastic estimation.
• Moments can be used to evaluate the diversity of generated samples.
• The method outperforms state-of-the-art methods for image deconvolution.

Many engineering problems require the prediction of realization-to-realization variability or a refined description of modeled quantities. In such cases, it is necessary to sample elements from unknown high-dimensional spaces with possibly millions of degrees of freedom. While methods exist to sample elements from probability density functions (PDFs) with known shapes, several approximations must be made when the distribution is unknown. In this paper, both the sampling method and the inference of the underlying distribution are handled with a data-driven method known as generative adversarial networks (GANs), which trains two competing neural networks to produce a network that can effectively generate samples from the training-set distribution. In practice, it is often necessary to draw samples from conditional distributions. When the conditional variables are continuous, only one data point (if any) corresponding to a particular value of a conditioning variable may be available, which is not sufficient to estimate the conditional distribution. This work addresses the problem using an a priori estimation of the conditional moments of the PDF. Two approaches for computing these moments, stochastic estimation and an external neural network, are compared here; however, any preferred method can be used. The algorithm is demonstrated on the deconvolution of a filtered turbulent flow field. It is shown that all versions of the proposed algorithm effectively sample the target conditional distribution with minimal impact on sample quality compared to state-of-the-art methods. Additionally, the procedure can be used as a metric for the diversity of samples generated by a conditional GAN (cGAN) conditioned on continuous variables.
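The record does not include an implementation, but the core idea described above (a conditional GAN whose generator is regularized toward a-priori estimates of conditional moments) can be sketched compactly. The PyTorch snippet below is a minimal, hypothetical illustration only: the network sizes, the toy linear `moment_net` standing in for the paper's stochastic-estimation or external-network moment estimator, the penalty weight `lambda_moment`, and the random stand-in data are all assumptions made for demonstration, not the authors' architecture or training setup.

```python
# Hypothetical sketch: a conditional GAN whose generator loss is augmented
# with a penalty matching the conditional mean of generated samples to an
# a-priori estimate of that moment. All dimensions, networks, and weights
# below are illustrative assumptions, not the paper's implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F

LR_DIM, SR_DIM, NOISE_DIM = 8, 32, 16   # assumed toy dimensions

class Generator(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(LR_DIM + NOISE_DIM, 64), nn.ReLU(),
            nn.Linear(64, SR_DIM),
        )
    def forward(self, xi_lr, z):
        # Map a conditioning variable xi_LR and a noise vector to a sample xi_SR.
        return self.net(torch.cat([xi_lr, z], dim=-1))

class Discriminator(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(LR_DIM + SR_DIM, 64), nn.ReLU(),
            nn.Linear(64, 1),
        )
    def forward(self, xi_lr, xi_sr):
        return self.net(torch.cat([xi_lr, xi_sr], dim=-1))

# Stand-in for an a-priori estimator of the conditional mean E[xi_SR | xi_LR];
# in the paper this role is played by stochastic estimation or an external
# neural network trained beforehand. Here it is a frozen linear map.
moment_net = nn.Linear(LR_DIM, SR_DIM)
for p in moment_net.parameters():
    p.requires_grad_(False)

G, D = Generator(), Discriminator()
opt_g = torch.optim.Adam(G.parameters(), lr=1e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=1e-4)
lambda_moment = 1.0   # weight of the moment-consistency penalty (assumed)

def train_step(xi_lr, xi_sr_real, n_samples=8):
    # Discriminator update: standard conditional GAN objective.
    z = torch.randn(xi_lr.shape[0], NOISE_DIM)
    xi_sr_fake = G(xi_lr, z).detach()
    logits_real = D(xi_lr, xi_sr_real)
    logits_fake = D(xi_lr, xi_sr_fake)
    d_loss = (F.binary_cross_entropy_with_logits(logits_real, torch.ones_like(logits_real))
              + F.binary_cross_entropy_with_logits(logits_fake, torch.zeros_like(logits_fake)))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()

    # Generator update: adversarial term plus conditional-moment penalty.
    # Several samples are drawn per conditioning value so their mean can be
    # compared against the pre-computed conditional-mean estimate.
    z = torch.randn(n_samples, xi_lr.shape[0], NOISE_DIM)
    fakes = G(xi_lr.expand(n_samples, -1, -1), z)        # (n_samples, batch, SR_DIM)
    logits_gen = D(xi_lr, fakes[0])
    adv = F.binary_cross_entropy_with_logits(logits_gen, torch.ones_like(logits_gen))
    moment_penalty = F.mse_loss(fakes.mean(dim=0), moment_net(xi_lr))
    g_loss = adv + lambda_moment * moment_penalty
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()
    return d_loss.item(), g_loss.item()

# Toy usage with random stand-in data (purely illustrative):
xi_lr = torch.randn(4, LR_DIM)
xi_sr = torch.randn(4, SR_DIM)
print(train_step(xi_lr, xi_sr))
```

Note that in this sketch the penalty is applied to the mean over several generated samples per conditioning value rather than to each sample individually; pushing every sample onto the conditional mean would defeat the purpose of sampling a distribution, and the same per-condition statistics are what the abstract proposes as a diversity metric.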
ISSN: 0021-9991, 1090-2716
DOI: 10.1016/j.jcp.2021.110853