Power of data in quantum machine learning

Bibliographic Details
Published in: Nature Communications, 2021-05, Vol. 12 (1), p. 2631, Article 2631
Main Authors: Huang, Hsin-Yuan; Broughton, Michael; Mohseni, Masoud; Babbush, Ryan; Boixo, Sergio; Neven, Hartmut; McClean, Jarrod R.
Format: Article
Language: English
Description
Summary: The use of quantum computing for machine learning is among the most exciting prospective applications of quantum technologies. However, machine learning tasks where data is provided can be considerably different from commonly studied computational tasks. In this work, we show that some problems that are classically hard to compute can be easily predicted by classical machines learning from data. Using rigorous prediction error bounds as a foundation, we develop a methodology for assessing potential quantum advantage in learning tasks. The bounds are tight asymptotically and empirically predictive for a wide range of learning models. These constructions explain numerical results showing that, with the help of data, classical machine learning models can be competitive with quantum models even when the latter are tailored to quantum problems. We then propose a projected quantum model that provides a simple and rigorous quantum speed-up for a learning problem in the fault-tolerant regime. For near-term implementations, we demonstrate a significant prediction advantage over some classical models on engineered data sets designed to exhibit a maximal quantum advantage, in one of the largest numerical tests for gate-based quantum machine learning to date, up to 30 qubits. Expectations for quantum machine learning are high, but there is currently a lack of rigorous results on which scenarios would actually exhibit a quantum advantage. Here, the authors show how to tell, for a given dataset, whether a quantum model would give any prediction advantage over a classical one.
ISSN: 2041-1723
DOI: 10.1038/s41467-021-22539-9