Expurgated Random-Coding Ensembles: Exponents, Refinements, and Connections
Published in: IEEE Transactions on Information Theory, 2014-08, Vol. 60, No. 8, pp. 4449-4462
Format: Article
Language: English
Summary: This paper studies expurgated random-coding bounds and exponents for channel coding with a given (possibly suboptimal) decoding rule. Variations of Gallager's analysis are presented, yielding several asymptotic and nonasymptotic bounds on the error probability for an arbitrary codeword distribution. A simple nonasymptotic bound is shown to attain an exponent of Csiszár and Körner under constant-composition coding. Using Lagrange duality, this exponent is expressed in several forms, one of which is shown to permit a direct derivation via cost-constrained coding that extends to infinite and continuous alphabets. The method of type class enumeration is studied, and it is shown that this approach can yield improved exponents and better tightness guarantees for some codeword distributions. A generalization of this approach is shown to provide a multiletter exponent that extends immediately to channels with memory.
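For context, a minimal sketch of Gallager's classical expurgated exponent for maximum-likelihood decoding over a discrete memoryless channel $W(y\mid x)$ with input distribution $Q$; the paper generalizes bounds of this type to a given, possibly suboptimal, decoding rule, so the expression below is only the standard benchmark form, not the paper's mismatched-decoding exponent:

$$
E_{\mathrm{ex}}(R) \;=\; \max_{\rho \ge 1}\, \max_{Q}\, \bigl[ E_{\mathrm{x}}(\rho, Q) - \rho R \bigr],
\qquad
E_{\mathrm{x}}(\rho, Q) \;=\; -\rho \log \sum_{x, x'} Q(x)\, Q(x') \Biggl( \sum_{y} \sqrt{W(y\mid x)\, W(y\mid x')} \Biggr)^{1/\rho}.
$$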
ISSN: 0018-9448, 1557-9654
DOI: 10.1109/TIT.2014.2322033