Optimal context quantization in lossless compression of image data sequences
Published in: | IEEE Transactions on Image Processing, 2004-04, Vol. 13 (4), p. 509-517 |
---|---|
Main Authors: | , , |
Format: | Article |
Language: | English |
Summary: | In image compression, context-based entropy coding is commonly used. A critical issue for the performance of context-based image coding is how to resolve the conflict between the desire for large templates, which model high-order statistical dependencies among pixels, and the problem of context dilution due to insufficient sample statistics of a given input image. We consider the problem of finding the optimal quantizer $Q$ that quantizes the $K$-dimensional causal context $C_t=(X_{t-t_1},X_{t-t_2},\ldots,X_{t-t_K})$ of a source symbol $X_t$ into one of a set of conditioning states. The optimality of context quantization is defined as the minimum static or minimum adaptive code length of a given data set. For a binary source alphabet, an optimal context quantizer can be computed exactly by a fast dynamic programming algorithm; faster approximate solutions are also proposed. In the case of an $m$-ary source alphabet, a random variable can be decomposed into a sequence of binary decisions, each of which is coded using optimal context quantization designed for the corresponding binary random variable. This optimized coding scheme is applied to digital maps and $\alpha$-plane sequences. The proposed optimal context quantization technique can also be used to establish a lower bound on the achievable code length, and is hence a useful tool for evaluating the performance of existing heuristic context quantizers. |
ISSN: | 1057-7149; 1941-0042 |
DOI: | 10.1109/TIP.2003.822613 |
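
The summary above centers on computing, for a binary source alphabet, an optimal context quantizer under the minimum-adaptive-code-length criterion via dynamic programming. The sketch below is a minimal illustration of that idea under stated assumptions, not the paper's algorithm: it assumes the structural result that an optimal quantizer groups raw contexts into contiguous intervals once they are sorted by their empirical probability of a 1, scores each conditioning state with a Laplace (add-one) adaptive code length, and uses a plain interval-partition DP. All function names and the example counts are illustrative.

```python
from math import lgamma, log

def adaptive_code_length(n0: int, n1: int) -> float:
    """Adaptive (sequential add-one / Laplace) code length, in bits, of a binary
    sequence with n0 zeros and n1 ones: -log2( n0! * n1! / (n0 + n1 + 1)! )."""
    return (lgamma(n0 + n1 + 2) - lgamma(n0 + 1) - lgamma(n1 + 1)) / log(2)

def optimal_context_quantizer(counts, num_states):
    """counts: one (n0, n1) pair per raw context (occurrence counts of 0s and 1s).
    Partitions the raw contexts, sorted by empirical P(X=1 | context), into
    `num_states` contiguous groups so that the summed adaptive code length of
    the groups is minimized. Returns (total_bits, sorted_context_order, cut_points)."""
    n = len(counts)
    # Sort raw contexts by their empirical probability of a 1.
    order = sorted(range(n),
                   key=lambda c: counts[c][1] / max(1, counts[c][0] + counts[c][1]))
    # Prefix sums of zero/one counts over the sorted contexts.
    p0, p1 = [0], [0]
    for c in order:
        p0.append(p0[-1] + counts[c][0])
        p1.append(p1[-1] + counts[c][1])

    def group_cost(i, j):
        # Code length of one conditioning state that merges sorted contexts i..j-1.
        return adaptive_code_length(p0[j] - p0[i], p1[j] - p1[i])

    INF = float("inf")
    # dp[m][j]: best cost of splitting the first j sorted contexts into m states.
    dp = [[INF] * (n + 1) for _ in range(num_states + 1)]
    back = [[0] * (n + 1) for _ in range(num_states + 1)]
    dp[0][0] = 0.0
    for m in range(1, num_states + 1):
        for j in range(1, n + 1):
            for i in range(m - 1, j):
                cost = dp[m - 1][i] + group_cost(i, j)
                if cost < dp[m][j]:
                    dp[m][j], back[m][j] = cost, i
    # Recover the cut points (right boundaries of each group, in sorted order).
    cuts, j = [], n
    for m in range(num_states, 0, -1):
        cuts.append(j)
        j = back[m][j]
    return dp[num_states][n], order, sorted(cuts)

if __name__ == "__main__":
    # Hypothetical per-context counts for 6 raw contexts, quantized to 2 states.
    counts = [(40, 2), (35, 5), (20, 10), (12, 18), (6, 30), (1, 44)]
    bits, order, cuts = optimal_context_quantizer(counts, num_states=2)
    print(f"total adaptive code length: {bits:.1f} bits; cuts after sorted contexts {cuts}")
```

This plain DP runs in O(num_states · n²) time over n raw contexts; the paper's exact algorithm is faster, and its approximate variants and the binary decomposition for m-ary alphabets are not reproduced here.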