
An Improved Quantum Algorithm for Ridge Regression

Bibliographic Details
Published in: IEEE Transactions on Knowledge and Data Engineering, 2021-03, Vol. 33 (3), pp. 858-866
Main Authors: Yu, Chao-Hua, Gao, Fei, Wen, Qiao-Yan
Format: Article
Language:English
Summary: Ridge regression (RR) is an important machine learning technique that introduces a regularization hyperparameter α into ordinary multiple linear regression to analyze data suffering from multicollinearity. In this paper, we present a quantum algorithm for RR, in which a technique of parallel Hamiltonian simulation, simulating a number of Hermitian matrices in parallel, is proposed and used to develop a quantum version of the K-fold cross-validation approach, which can efficiently estimate the predictive performance of RR. Our algorithm consists of two phases: (1) using quantum K-fold cross-validation to efficiently determine a good α with which RR achieves good predictive performance, and then (2) generating a quantum state encoding the optimal fitting parameters of RR with this α, which can be further used to predict new data. Since indefinite dense Hamiltonian simulation is adopted as a key subroutine, our algorithm can efficiently handle non-sparse data matrices. We show that our algorithm achieves exponential speedup over its classical counterpart for (low-rank) data matrices with low condition numbers; however, when the condition numbers are large, i.e., the data matrices have full or approximately full rank, only polynomial speedup is achieved.
ISSN: 1041-4347, 1558-2191
DOI: 10.1109/TKDE.2019.2937491
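The quantum subroutines aside, the classical workflow the abstract describes — K-fold cross-validation to select a good α, then solving ridge regression with that α — can be sketched as follows. This is a minimal classical illustration, not the paper's quantum algorithm; the toy data, fold count, and candidate α values are assumptions for demonstration only:

```python
import numpy as np

def ridge_fit(X, y, alpha):
    # Closed-form ridge solution: w = (X^T X + alpha * I)^{-1} X^T y
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + alpha * np.eye(d), X.T @ y)

def kfold_cv_error(X, y, alpha, K=5):
    # Average held-out mean-squared error over K folds
    n = len(y)
    idx = np.arange(n)
    errs = []
    for fold in np.array_split(idx, K):
        train = np.setdiff1d(idx, fold)
        w = ridge_fit(X[train], y[train], alpha)
        errs.append(np.mean((X[fold] @ w - y[fold]) ** 2))
    return float(np.mean(errs))

# Toy data with multicollinearity: x2 is nearly a copy of x1
rng = np.random.default_rng(0)
x1 = rng.normal(size=200)
x2 = x1 + 1e-3 * rng.normal(size=200)
X = np.column_stack([x1, x2])
y = X @ np.array([1.0, 1.0]) + 0.1 * rng.normal(size=200)

# Phase 1: pick the alpha with the best cross-validated error
alphas = [1e-4, 1e-2, 1.0, 10.0]
best = min(alphas, key=lambda a: kfold_cv_error(X, y, a))

# Phase 2: fit the final model with the chosen alpha
w = ridge_fit(X, y, best)
```

Regularization matters here precisely because the near-duplicate columns make XᵀX ill-conditioned; the α·I term keeps the linear solve stable, which mirrors the condition-number dependence discussed in the abstract.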