A Characterization of Quantum Generative Models
Published in: ACM Transactions on Quantum Computing (Print), 2024-06, Vol. 5 (2), pp. 1-34, Article 12
Format: Article
Language: English
Summary: Quantum generative modeling is a growing area of interest for industry-relevant applications. This work systematically compares a broad range of techniques to guide quantum computing practitioners when deciding which models and methods to use in their applications. We compare fundamentally different architectural ansatzes of parametric quantum circuits: (1) a continuous architecture, which produces continuous-valued data samples, and (2) a discrete architecture, which samples on a discrete grid. We also compare the performance of different data transformations: the min-max and the probability integral transforms. We use two popular training methods: (1) quantum circuit Born machines (QCBM) and (2) quantum generative adversarial networks (QGAN). We study their performance and tradeoffs as the number of model parameters increases, with a baseline comparison of similarly trained classical neural networks. The study is performed on six low-dimensional synthetic and two real financial data sets. Our two key findings are: (1) for all data sets, our quantum models require similar or fewer parameters than their classical counterparts; in the extreme case, the quantum models require two orders of magnitude fewer parameters. (2) We empirically find that a variant of the discrete architecture, which learns the copula of the probability distribution, outperforms all other methods.
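The summary mentions two data transformations applied before training: the min-max transform and the probability integral transform (the latter yields uniform marginals, which is what a copula-learning model operates on). A minimal sketch of what these transforms might look like, assuming an empirical-CDF version of the probability integral transform; the function names are illustrative, not taken from the paper:

```python
import numpy as np

def min_max_transform(x):
    """Rescale samples linearly into [0, 1]."""
    return (x - x.min()) / (x.max() - x.min())

def probability_integral_transform(x):
    """Map each sample to its empirical CDF value, producing
    approximately uniform marginals on (0, 1)."""
    ranks = np.argsort(np.argsort(x))          # rank of each sample, 0..n-1
    return (ranks + 1) / (len(x) + 1)          # strictly inside (0, 1)

rng = np.random.default_rng(0)
samples = rng.normal(loc=5.0, scale=2.0, size=1000)

mm = min_max_transform(samples)                # values span exactly [0, 1]
pit = probability_integral_transform(samples)  # marginally ~uniform on (0, 1)
```

Either transform maps raw data onto the unit interval so it can be matched against the outputs of the discrete or continuous circuit architectures; the paper compares how the choice affects model performance.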
ISSN: 2643-6809, 2643-6817
DOI: 10.1145/3655027