
ISABELA for effective in situ compression of scientific data

Bibliographic Details
Published in: Concurrency and Computation, 2013-02, Vol. 25 (4), p. 524-540
Main Authors: Lakshminarasimhan, Sriram, Shah, Neil, Ethier, Stephane, Ku, Seung-Hoe, Chang, C. S., Klasky, Scott, Latham, Rob, Ross, Rob, Samatova, Nagiza F.
Format: Article
Language: English
Description
Summary: Exploding dataset sizes from extreme-scale scientific simulations necessitate efficient data management and reduction schemes to mitigate I/O costs. Because of the discrepancy between I/O bandwidth and computational power, scientists are forced to capture data infrequently, thereby making data collection an inherently lossy process. Although data compression can be an effective solution, the random nature of real-valued scientific datasets renders lossless compression routines ineffective. These techniques also impose significant overhead during decompression, making them unsuitable for data analysis and visualization, which require repeated data access.

To address this problem, we propose an effective method for In situ Sort-And-B-spline Error-bounded Lossy Abatement (ISABELA) of scientific data that is widely regarded as effectively incompressible. With ISABELA, we apply a pre-conditioner to seemingly random and noisy data along the spatial dimension to achieve an accurate fitting model that guarantees a ≥ 0.99 correlation with the original data. We further take advantage of temporal patterns in scientific data to compress data by ≈ 85%, while introducing only a negligible runtime overhead on simulations. ISABELA significantly outperforms existing lossy compression methods, such as wavelet compression, in terms of data reduction and accuracy.

We extend our previous paper by additionally building a communication-free, scalable parallel storage framework on top of ISABELA-compressed data that is ideally suited for extreme-scale analytical processing. The basis for our storage framework is an inherently local decompression method (it need not decode the entire dataset), which allows for random-access decompression and low-overhead task division that can be exploited over heterogeneous architectures. Furthermore, analytical operations such as correlation and query processing run quickly and accurately over data in the compressed space. Copyright © 2012 John Wiley & Sons, Ltd.
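The core idea named in the abstract, sort the values in a window so they form a smooth monotone curve, fit a cubic B-spline to that curve, and keep only the spline coefficients plus the sorting permutation, can be sketched as follows. This is an illustrative approximation using NumPy/SciPy, not the authors' ISABELA implementation; the window size, the number of interior knots, and the use of splrep/splev are assumptions made for the example, and no per-point error bound is enforced here.

    import numpy as np
    from scipy.interpolate import splrep, splev

    # Illustrative sketch only -- not the authors' ISABELA code. Window size
    # (1024 values) and knot count (30 interior knots) are arbitrary choices.
    rng = np.random.default_rng(0)
    window = rng.standard_normal(1024)           # one window of "noisy" simulation data

    order = np.argsort(window)                   # pre-conditioner: the sorting permutation
    sorted_vals = window[order]                  # sorted values form a smooth monotone curve

    x = np.arange(sorted_vals.size, dtype=float)
    knots = np.linspace(x[0], x[-1], 32)[1:-1]   # interior knots; fewer knots -> more compression
    tck = splrep(x, sorted_vals, t=knots, k=3)   # least-squares cubic B-spline fit

    # "Decompression": evaluate the spline and undo the sort. A real codec would
    # store only the spline coefficients and a compressed encoding of `order`.
    approx = np.empty_like(window)
    approx[order] = splev(x, tck)

    # Pearson correlation with the original data (the paper reports >= 0.99).
    print("correlation:", np.corrcoef(window, approx)[0, 1])

Sorting makes each window highly compressible at the cost of storing the permutation, which the paper addresses with index compression and temporal prediction; because each window is encoded independently, decompression is local and supports the random-access, parallel analytics described above.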
ISSN: 1532-0626, 1532-0634
DOI: 10.1002/cpe.2887