
Least Squares Superposition Codes With Bernoulli Dictionary are Still Reliable at Rates up to Capacity


Bibliographic Details
Published in: IEEE Transactions on Information Theory, 2014-05, Vol. 60 (5), p. 2737-2750
Main Authors: Takeishi, Yoshinari, Kawakita, Masanori, Takeuchi, Jun'ichi
Format: Article
Language:English
Description
Summary: For the additive white Gaussian noise channel with an average power constraint, sparse superposition codes with least squares decoding were proposed by Barron and Joseph in 2010. The codewords are designed using a dictionary, each entry of which is drawn from a Gaussian distribution. The error probability is shown to be exponentially small for all rates up to the capacity. This paper proves that when each entry of the dictionary is instead drawn from a Bernoulli distribution, the error probability remains exponentially small for all rates up to the capacity. The proof relies on a central limit theorem-type inequality, which we establish for this analysis.
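To illustrate the coding scheme described in the summary, the following minimal Python sketch builds a Bernoulli (+/-1) dictionary, forms a sparse superposition codeword as a sum of one column per section, transmits it over an AWGN channel, and decodes by exhaustive least squares. The parameters, the +/- sqrt(P/L) scaling, and the brute-force search are illustrative assumptions for tiny block sizes, not the authors' construction or decoder.

import itertools
import numpy as np

rng = np.random.default_rng(0)

L, M, n = 4, 4, 32          # L sections of M columns each; block length n
P, sigma2 = 1.0, 0.25       # average power constraint and noise variance

# Bernoulli dictionary: each entry is +/- sqrt(P/L) with probability 1/2,
# so a codeword (a sum of L columns) has average power about P.
X = rng.choice([-1.0, 1.0], size=(n, L * M)) * np.sqrt(P / L)

def encode(message):
    """message: tuple of L indices, one column chosen from each section."""
    cols = [X[:, s * M + m] for s, m in enumerate(message)]
    return np.sum(cols, axis=0)

def least_squares_decode(y):
    """Exhaustive least squares decoder (feasible only for tiny L and M)."""
    best, best_err = None, np.inf
    for candidate in itertools.product(range(M), repeat=L):
        err = np.sum((y - encode(candidate)) ** 2)
        if err < best_err:
            best, best_err = candidate, err
    return best

message = tuple(rng.integers(0, M, size=L))
y = encode(message) + rng.normal(0.0, np.sqrt(sigma2), size=n)  # AWGN channel
print("sent:", message, "decoded:", least_squares_decode(y))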
ISSN: 0018-9448, 1557-9654
DOI: 10.1109/TIT.2014.2312728