Quantum-Train: Rethinking Hybrid Quantum-Classical Machine Learning in the Model Compression Perspective
We introduce the Quantum-Train (QT) framework, a novel approach that integrates quantum computing with classical machine learning algorithms to address significant challenges in data encoding, model compression, and inference hardware requirements. Even with a slight decrease in accuracy, QT achieves...
Published in: | arXiv.org, 2024-06 |
---|---|
Main Authors: | Chen-Yu, Liu; En-Jui Kuo; Chu-Hsuan Abraham Lin; Young, Jason Gemsun; Chang, Yeong-Jar; Hsieh, Min-Hsiu; Hsi-Sheng Goan |
Format: | Article |
Language: | English |
Subjects: | Algorithms; Error reduction; Machine learning; Neural networks; Quantum computing |
Online Access: | Get full text |
cited_by | |
---|---|
cites | |
container_end_page | |
container_issue | |
container_start_page | |
container_title | arXiv.org |
container_volume | |
creator | Chen-Yu, Liu; En-Jui Kuo; Chu-Hsuan Abraham Lin; Young, Jason Gemsun; Chang, Yeong-Jar; Hsieh, Min-Hsiu; Hsi-Sheng Goan |
description | We introduce the Quantum-Train (QT) framework, a novel approach that integrates quantum computing with classical machine learning algorithms to address significant challenges in data encoding, model compression, and inference hardware requirements. Even with a slight decrease in accuracy, QT achieves remarkable results by employing a quantum neural network alongside a classical mapping model, which significantly reduces the parameter count from \(M\) to \(O(\text{polylog}(M))\) during training. Our experiments demonstrate QT's effectiveness in classification tasks, offering insights into its potential to revolutionize machine learning by leveraging quantum computational advantages. This approach not only improves model efficiency but also reduces generalization errors, showcasing QT's potential across various machine learning applications. |
format | article |
fulltext | fulltext |
identifier | EISSN: 2331-8422 |
ispartof | arXiv.org, 2024-06 |
issn | 2331-8422 |
language | eng |
recordid | cdi_proquest_journals_3057514615 |
source | Publicly Available Content Database |
subjects | Algorithms; Error reduction; Machine learning; Neural networks; Quantum computing |
title | Quantum-Train: Rethinking Hybrid Quantum-Classical Machine Learning in the Model Compression Perspective |
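The abstract above describes QT's central idea: the \(M\) weights of a target classical network are generated from a quantum neural network over roughly \(\log_2 M\) qubits together with a small classical mapping model, so only the QNN angles and the mapping model, an \(O(\text{polylog}(M))\) number of values, are trained. The sketch below illustrates only this parameter-counting argument; the layer count, the product-state probability model, the hidden width, and the mapping features are assumptions chosen for illustration, not the authors' implementation.

```python
# Illustrative sketch (not the authors' implementation) of the Quantum-Train
# parameter-counting idea: a target network with M weights is generated from
# a quantum state over n = ceil(log2 M) qubits plus a small classical mapping
# model, so only O(polylog(M)) parameters are trained.
import numpy as np

M = 6690                      # hypothetical target-network weight count
n = int(np.ceil(np.log2(M)))  # qubits needed so that 2**n >= M

# Toy "QNN": one rotation angle per qubit per layer, simulated classically.
L = 3                                  # number of variational layers (assumed)
theta = np.random.randn(L, n) * 0.1    # trainable QNN angles: L * n values

def qnn_probabilities(theta: np.ndarray) -> np.ndarray:
    """Return 2**n basis-state probabilities from a crude product-state model.

    A real QNN would use entangling gates; independent single-qubit rotations
    are enough to show where the 2**n numbers come from.
    """
    p1 = np.sin(theta.sum(axis=0)) ** 2                        # P(qubit i = 1)
    bits = (np.arange(2 ** n)[:, None] >> np.arange(n)) & 1    # (2**n, n)
    return np.prod(np.where(bits == 1, p1, 1.0 - p1), axis=1)  # (2**n,)

# Small classical mapping model: (bitstring, probability) -> one network weight.
H = 8                                  # hidden width (assumed)
W1 = np.random.randn(n + 1, H) * 0.1   # trainable
W2 = np.random.randn(H, 1) * 0.1       # trainable

def generate_weights() -> np.ndarray:
    probs = qnn_probabilities(theta)[:M]                        # first M outcomes
    bits = ((np.arange(M)[:, None] >> np.arange(n)) & 1).astype(float)
    features = np.hstack([bits, probs[:, None]])                # (M, n + 1)
    return (np.tanh(features @ W1) @ W2).ravel()                # M generated weights

weights = generate_weights()           # weights of the full target network
trainable = theta.size + W1.size + W2.size
print(f"generated {weights.size} weights from {trainable} trainable parameters")
# e.g. M = 6690 target weights but only 3*13 + 14*8 + 8 = 159 trained values.
```

In an actual training loop, gradients would flow through the generated weights back to the QNN angles and the mapping model; the sketch above only demonstrates the counting, not the optimization.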