Quantum learning Boolean linear functions w.r.t. product distributions
Published in: | Quantum information processing, 2020-06, Vol. 19 (6), Article 172 |
---|---|
Main Author: | Caro, Matthias C. |
Format: | Article |
Language: | English |
Subjects: | Algorithms; Bias; Boolean; Boolean algebra; Complexity; Data Structures and Information Theory; Fourier transforms; Linear functions; Lower bounds; Machine learning; Mathematical Physics; Physics; Physics and Astronomy; Quantum computers; Quantum Computing; Quantum Information Technology; Quantum Physics; Spintronics; Training |

container_issue | 6 |
---|---|
container_title | Quantum information processing |
container_volume | 19 |
creator | Caro, Matthias C. |
description | The problem of learning Boolean linear functions from quantum examples w.r.t. the uniform distribution can be solved on a quantum computer using the Bernstein–Vazirani algorithm (Bernstein and Vazirani, in: Kosaraju (ed) Proceedings of the twenty-fifth annual ACM symposium on theory of computing, ACM, New York, 1993. https://doi.org/10.1145/167088.167097). A similar strategy can be applied in the case of noisy quantum training data, as was observed in Grilo et al. (Learning with errors is easy with quantum samples, 2017). However, extensions of these learning algorithms beyond the uniform distribution have not yet been studied. We employ the biased quantum Fourier transform introduced in Kanade et al. (Learning DNFs under product distributions via μ-biased quantum Fourier sampling, 2018) to develop efficient quantum algorithms for learning Boolean linear functions on n bits from quantum examples w.r.t. a biased product distribution. Our first procedure is applicable to any (except full) bias and requires O(ln(n)) quantum examples. The number of quantum examples used by our second algorithm is independent of n, but the strategy is applicable only for small bias. Moreover, we show that the second procedure is stable w.r.t. noisy training data and w.r.t. faulty quantum gates. This also enables us to solve a version of the learning problem in which the underlying distribution is not known in advance. Finally, we prove lower bounds on the classical and quantum sample complexities of the learning problem. Whereas classically, Ω(n) examples are necessary independently of the bias, we are able to establish a quantum sample complexity lower bound of Ω(ln(n)) only under an assumption of large bias. Nevertheless, this allows for a discussion of the performance of our suggested learning algorithms w.r.t. sample complexity. With our analysis, we contribute to a more quantitative understanding of the power and limitations of quantum training data for learning classical functions. |
doi_str_mv | 10.1007/s11128-020-02661-1 |
format | article |
identifier | ISSN: 1570-0755 |
ispartof | Quantum information processing, 2020-06, Vol.19 (6), Article 172 |
issn | 1570-0755; 1573-1332 (EISSN) |
language | eng |
recordid | cdi_proquest_journals_2392535080 |
source | Springer Link |
subjects | Algorithms; Bias; Boolean; Boolean algebra; Complexity; Data Structures and Information Theory; Fourier transforms; Linear functions; Lower bounds; Machine learning; Mathematical Physics; Physics; Physics and Astronomy; Quantum computers; Quantum Computing; Quantum Information Technology; Quantum Physics; Spintronics; Training |
title | Quantum learning Boolean linear functions w.r.t. product distributions |
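As background to the abstract: its starting point — recovering the hidden string a of a Boolean linear function f_a(x) = a·x mod 2 from uniform-distribution quantum examples via a Bernstein–Vazirani-style Hadamard transform — can be reproduced with a short state-vector simulation. The sketch below is not code from the article; the function names, register layout, and test string are illustrative assumptions, and the article's actual contribution (the μ-biased product-distribution algorithms and their noise stability) is not implemented here. It relies on the standard fact that applying a Hadamard to every qubit of the example state 2^(-n/2) Σ_x |x, a·x⟩ leaves (|0…0⟩|0⟩ + |a⟩|1⟩)/√2, so each measured example reveals a with probability 1/2.

```python
import numpy as np

rng = np.random.default_rng(seed=0)

def quantum_example_state(a):
    """Uniform quantum example 2^(-n/2) * sum_x |x>|a.x mod 2> as a state vector.

    Qubit order (an arbitrary choice for this sketch): the n input qubits are the
    most-significant index bits, the label qubit is the least-significant one.
    """
    n = len(a)
    amp = np.zeros(2 ** (n + 1))
    for x in range(2 ** n):
        bits = np.array([(x >> (n - 1 - i)) & 1 for i in range(n)])
        label = int(bits @ a) % 2
        amp[(x << 1) | label] = 2 ** (-n / 2)
    return amp

def hadamard_all(state, num_qubits):
    """Apply H on every qubit by building the dense n-fold tensor power (fine for small n)."""
    h1 = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2.0)
    h = np.array([[1.0]])
    for _ in range(num_qubits):
        h = np.kron(h, h1)
    return h @ state

def learn_linear_function(a):
    """Recover the hidden string a, Bernstein-Vazirani style, from quantum examples.

    After H on all n+1 qubits the state is (|0...0>|0> + |a>|1>)/sqrt(2), so each
    measured example reveals a with probability 1/2; retry on failure.
    """
    n = len(a)
    while True:
        state = hadamard_all(quantum_example_state(a), n + 1)
        outcome = rng.choice(len(state), p=np.abs(state) ** 2)  # simulated measurement
        if outcome & 1:  # label qubit is 1, so the input register holds a
            return np.array([(outcome >> (n - i)) & 1 for i in range(n)])
        # label qubit was 0 (input register all zeros): draw a fresh example

hidden = np.array([1, 0, 1, 1, 0])
print(learn_linear_function(hidden))  # prints [1 0 1 1 0]
```

On average this consumes two quantum examples regardless of n; the article's algorithms extend this style of quantum Fourier sampling to μ-biased product distributions and to noisy training data.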