Erasable Virtual HyperLogLog for Approximating Cumulative Distribution over Data Streams
Published in: IEEE Transactions on Knowledge and Data Engineering, 2022-11, Vol. 34 (11), pp. 5336-5350
Main Authors:
Format: Article
Language: English
Summary: Many real-world datasets arrive as streams of entity-identifier pairs, and measuring the data distribution over these datasets is fundamental for applications such as privacy protection. In this paper, we study the problem of computing the cumulative distribution over cardinalities (i.e., the number of distinct entities owning the same identifier). Previous sketch-based methods consume large amounts of memory, especially when there are many identifiers, while sampling-based methods require much time for cardinality estimation. A recent work, KHyperLogLog, combines sketching and sampling, but it is wasteful to build a separate, large HyperLogLog sketch for identifiers with small cardinalities. To address these challenges, we propose a memory-efficient method, EV-HLL, which designs a shared structure to store all sampled identifiers and their entities and uses additional sketches to track value updates during the sampling procedure. Meanwhile, EV-HLL provides real-time unbiased estimates that reflect value changes whenever a new entity-identifier pair arrives. We evaluate the performance of EV-HLL and other state-of-the-art methods on publicly available real-world datasets. Experimental results demonstrate that, compared to other methods, EV-HLL reduces memory usage at the same estimation accuracy and achieves higher accuracy at the same memory usage.
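
The record does not include the paper's algorithmic details, so the following is only a minimal illustrative sketch of the general sample-then-sketch idea the abstract attributes to prior work (a KHyperLogLog-style layout), not EV-HLL's shared structure or unbiased estimators. The class names (`HyperLogLog`, `SampledCardinalityCDF`), the sampling rate, and the register count are assumptions chosen for illustration.

```python
# Illustrative sample-then-sketch baseline for approximating the cumulative
# distribution of cardinalities over an entity-identifier stream. This is NOT
# the paper's EV-HLL design; it mirrors the prior-work layout the abstract
# criticizes, where every sampled identifier gets its own HyperLogLog.

import hashlib
from collections import defaultdict


def _hash64(value: str) -> int:
    """Stable 64-bit hash of a string."""
    return int.from_bytes(hashlib.blake2b(value.encode(), digest_size=8).digest(), "big")


class HyperLogLog:
    """Tiny HyperLogLog with 2**p registers (standard alpha constant, no small-range correction)."""

    def __init__(self, p: int = 6):
        self.p = p
        self.m = 1 << p
        self.registers = [0] * self.m

    def add(self, item: str) -> None:
        h = _hash64(item)
        idx = h >> (64 - self.p)                    # first p bits pick the register
        w = h & ((1 << (64 - self.p)) - 1)          # remaining 64-p bits
        rank = (64 - self.p) - w.bit_length() + 1   # leading zeros in w, plus one
        self.registers[idx] = max(self.registers[idx], rank)

    def estimate(self) -> float:
        alpha = 0.709 if self.m >= 64 else 0.673
        z = sum(2.0 ** -r for r in self.registers)
        return alpha * self.m * self.m / z


class SampledCardinalityCDF:
    """Subsample identifiers by hash; per sampled identifier, sketch its distinct entities."""

    def __init__(self, sample_rate: float = 0.01, p: int = 6):
        self.threshold = int(sample_rate * (1 << 64))
        self.sketches = defaultdict(lambda: HyperLogLog(p))

    def add(self, entity: str, identifier: str) -> None:
        # An identifier is tracked iff its hash falls under the sampling threshold,
        # so the same identifier is consistently sampled (or not) across the stream.
        if _hash64(identifier) < self.threshold:
            self.sketches[identifier].add(entity)

    def cdf(self, t: int) -> float:
        """Estimated fraction of identifiers whose cardinality is <= t."""
        if not self.sketches:
            return 0.0
        below = sum(1 for s in self.sketches.values() if s.estimate() <= t)
        return below / len(self.sketches)
```

Feeding the stream through `add(entity, identifier)` and then calling `cdf(t)` for increasing t yields an approximate cumulative distribution over cardinalities. The per-identifier sketches are exactly the memory cost the abstract identifies as wasteful for small-cardinality identifiers; EV-HLL instead stores all sampled identifiers and their entities in one shared structure.
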
ISSN: 1041-4347, 1558-2191
DOI: 10.1109/TKDE.2021.3052938