
A Rate-Splitting Approach to Fading Channels With Imperfect Channel-State Information


Bibliographic Details
Published in: IEEE Transactions on Information Theory, 2014-07, Vol. 60 (7), p. 4266-4285
Main Authors: Pastore, Adriano; Koch, Tobias; Rodriguez Fonollosa, Javier
Format: Article
Language: English
Summary: As shown by Médard, the capacity of fading channels with imperfect channel-state information can be lower-bounded by assuming a Gaussian channel input X with power P and by upper-bounding the conditional entropy h(X|Y, Ĥ) by the entropy of a Gaussian random variable with variance equal to the linear minimum mean-square error in estimating X from (Y, Ĥ). We demonstrate that, using a rate-splitting approach, this lower bound can be sharpened: by expressing the Gaussian input X as the sum of two independent Gaussian variables X₁ and X₂, by applying Médard's lower bound first to bound the mutual information between X₁ and Y while treating X₂ as noise, and by applying it a second time to the mutual information between X₂ and Y while assuming X₁ to be known, we obtain a capacity lower bound that is strictly larger than Médard's lower bound. We then generalize this approach to an arbitrary number L of layers, where X is expressed as the sum of L independent Gaussian random variables of respective variances P_ℓ, ℓ = 1, ..., L, summing up to P. Among all such rate-splitting bounds, we determine the supremum over power allocations P_ℓ and total number of layers L. This supremum is achieved for L → ∞ and gives rise to an analytically expressible capacity lower bound. For Gaussian fading, this novel bound is shown to converge to the Gaussian-input mutual information as the signal-to-noise ratio (SNR) grows, provided that the variance of the channel estimation error H - Ĥ tends to zero as the SNR tends to infinity.
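The two-layer idea in the abstract can be illustrated numerically. The sketch below is a Monte-Carlo estimate of Médard-style rate expressions for a scalar Gaussian fading channel Y = HX + Z with unit-variance noise and estimation-error variance sigma2; it is an illustrative approximation in the spirit of the abstract, not the paper's exact bounds, and the power split, sample count, and parameter values are arbitrary choices. The improvement of the split bound comes from conditioning layer 2 on the realized |X₁|²: since u ↦ log(1 + a/(b + u)) is convex, averaging over |X₁|² beats plugging in its mean.

```python
import math
import random

def medard_and_two_layer(P, sigma2, split=0.5, n=200_000, seed=0):
    """Monte-Carlo sketch (illustrative, not the paper's exact expressions).

    Compares a Medard-style single-layer lower bound with a two-layer
    rate-splitting bound for Y = H*X + Z, H ~ CN(0,1), channel estimate
    Hhat with error variance sigma2, unit-variance noise.
    Returns (single_layer_bits, two_layer_bits) per channel use.
    """
    rng = random.Random(seed)
    P1, P2 = split * P, (1.0 - split) * P
    med = lay1 = lay2 = 0.0
    N = 1.0 + sigma2 * P                 # noise + total estimation-error power
    for _ in range(n):
        # |Hhat|^2 is exponential with mean 1 - sigma2 (complex Gaussian Hhat)
        g = rng.expovariate(1.0) * (1.0 - sigma2)
        # |X1|^2 is exponential with mean P1 (complex Gaussian layer-1 input)
        u = rng.expovariate(1.0) * P1
        med += math.log2(1.0 + P * g / N)
        # layer 1: X2 treated as extra noise of power P2*|Hhat|^2
        lay1 += math.log2(1.0 + P1 * g / (N + P2 * g))
        # layer 2: X1 known; residual self-interference sigma2*|X1|^2
        lay2 += math.log2(1.0 + P2 * g / (1.0 + sigma2 * P2 + sigma2 * u))
    return med / n, (lay1 + lay2) / n

single, two_layer = medard_and_two_layer(P=10.0, sigma2=0.1)
print(f"single-layer bound : {single:.3f} bit/use")
print(f"two-layer bound    : {two_layer:.3f} bit/use")
```

Replacing the realized |X₁|² by its mean P₁ in the layer-2 term would collapse the sum back to the single-layer expression, which is why the gain vanishes without that conditioning.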
ISSN: 0018-9448, 1557-9654
DOI: 10.1109/TIT.2014.2321567