Optimizing Secure Decision Tree Inference Outsourcing
| Published in | IEEE Transactions on Dependable and Secure Computing, 2023-07, Vol. 20 (4), pp. 3079-3092 |
|---|---|
| Main Authors | , , , , |
| Format | Article |
| Language | English |
| Summary | Outsourcing decision tree inference services to the cloud is highly beneficial, yet it raises critical privacy concerns about the proprietary decision tree of the model provider and the private input data of the client. In this paper, we design, implement, and evaluate a new system that allows highly efficient outsourcing of decision tree inference. Our system significantly improves upon prior art in the overall online end-to-end secure inference service latency at the cloud, as well as in the local-side performance of the model provider. We first present a new scheme that securely shifts most of the processing of the model provider to the cloud, resulting in a substantial reduction in the model provider's performance complexities. We further devise a scheme that substantially optimizes the performance of secure decision tree inference at the cloud, particularly the communication round complexities. The synergy of these techniques allows our new system to achieve up to 8× better overall online end-to-end secure inference latency at the cloud side over a realistic WAN environment, and to bring the model provider up to 19× savings in communication and 18× savings in computation. |
| ISSN | 1545-5971; 1941-0018 |
| DOI | 10.1109/TDSC.2022.3194048 |
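
For context on what is being outsourced, the sketch below shows (a) the plaintext decision tree inference functionality that systems like the one summarized above aim to evaluate obliviously, and (b) 2-out-of-2 additive secret sharing of the client's feature vector, a building block commonly used in two-server secure outsourcing. This is a minimal illustrative sketch under those assumptions, not the paper's actual protocol; all identifiers (`Node`, `infer`, `share`, `MODULUS`) are hypothetical.

```python
# Illustrative sketch only (not the paper's protocol): plaintext decision tree
# inference plus additive secret sharing of the client's inputs.
import secrets
from dataclasses import dataclass
from typing import Optional

MODULUS = 2 ** 32  # ring Z_{2^32} for additive shares (illustrative choice)


@dataclass
class Node:
    """Binary decision tree node: internal nodes compare one feature against a
    threshold; leaves carry the predicted label."""
    feature: Optional[int] = None     # index of the feature to test
    threshold: Optional[int] = None   # comparison threshold
    left: Optional["Node"] = None     # taken when x[feature] < threshold
    right: Optional["Node"] = None    # taken otherwise
    label: Optional[int] = None       # set only on leaves


def infer(root: Node, x: list[int]) -> int:
    """Plaintext inference: the functionality the client and model provider
    want computed without revealing x or the tree."""
    node = root
    while node.label is None:
        node = node.left if x[node.feature] < node.threshold else node.right
    return node.label


def share(value: int) -> tuple[int, int]:
    """Split a value into two additive shares modulo 2^32, one per cloud
    server; each share alone is uniformly random and reveals nothing."""
    s0 = secrets.randbelow(MODULUS)
    s1 = (value - s0) % MODULUS
    return s0, s1


def reconstruct(s0: int, s1: int) -> int:
    """Recombine two additive shares (done only by an authorized party)."""
    return (s0 + s1) % MODULUS


if __name__ == "__main__":
    # Toy tree over two features.
    tree = Node(feature=0, threshold=5,
                left=Node(feature=1, threshold=3,
                          left=Node(label=1), right=Node(label=2)),
                right=Node(label=3))
    x = [4, 7]
    print("plaintext inference:", infer(tree, x))  # -> 2

    # The client would send one share of each feature to each cloud server;
    # the servers then run secure comparison/evaluation protocols on the
    # shares (omitted here) rather than ever reconstructing x.
    shares = [share(v) for v in x]
    assert [reconstruct(a, b) for a, b in shares] == x
```

In a real deployment the servers never reconstruct the inputs; they evaluate the comparisons at each tree node interactively over the shares, which is where the communication-round and latency optimizations reported in the summary above would apply.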