On Properties of the Support of Capacity-Achieving Distributions for Additive Noise Channel Models With Input Cost Constraints

Bibliographic Details
Published in: IEEE Transactions on Information Theory, 2018-02, Vol. 64 (2), pp. 1178-1198
Main Authors: Fahs, Jihad, Abou-Faycal, Ibrahim
Format: Article
Language:English
Summary: We study the classical problem of characterizing the channel capacity and its achieving distribution in a generic fashion. We derive a simple relation between three parameters (the input-output function, the input cost function, and the noise probability density function) that dictates the type of the optimal input. In lay terms, we prove that the support of the optimal input is bounded whenever the cost grows faster than a "cutoff" growth rate equal to the logarithm of the inverse of the noise probability density function evaluated at the input-output function. Conversely, we prove that whenever the cost grows slower than the "cutoff" rate, the optimal input necessarily has unbounded support. In addition, we show that discreteness of the optimal input is guaranteed whenever the triplet satisfies certain analyticity properties. We argue that a suitable cost function to impose on the channel input is one that grows similarly to the "cutoff" rate. Our results are valid for any super-logarithmic cost function. They subsume a large number of previous channel capacity results and yield new ones for a wide range of communication channel models, such as Gaussian mixtures, generalized Gaussians, and heavy-tailed noise models, which we state along with numerical computations.
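The boundedness criterion stated in the abstract can be sketched in symbols. The notation below (channel model, cost function name, and noise density name) is an illustrative assumption for readability, not taken verbatim from the paper:

```latex
% Additive noise channel with input-output function g(\cdot) and noise density p_N:
\[
  Y \;=\; g(X) + N, \qquad \mathbb{E}\!\left[\,C(X)\,\right] \;\le\; A ,
\]
% The "cutoff" growth rate at input x is the logarithm of the inverse of the
% noise density evaluated at the input-output function:
\[
  \ell(x) \;=\; \log \frac{1}{p_N\!\big(g(x)\big)} .
\]
% Informal statement of the dichotomy: if C(x) grows faster than \ell(x) as
% |x| \to \infty, the capacity-achieving input has bounded support; if C(x)
% grows slower than \ell(x), its support is necessarily unbounded.
```

For instance, under Gaussian noise with a linear input-output function, the cutoff rate grows quadratically, which is consistent with the classical quadratic (power) constraint sitting at the boundary between the two regimes.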
ISSN: 0018-9448, 1557-9654
DOI: 10.1109/TIT.2017.2771815