Linear Programming-Based Converses for Finite Blocklength Lossy Joint Source-Channel Coding

Bibliographic Details
Published in: IEEE Transactions on Information Theory, 2017-11, Vol. 63 (11), pp. 7066-7094
Main Authors: Jose, Sharu Theresa; Kulkarni, Ankur A.
Format: Article
Language:English
Description
Summary:A linear programming (LP)-based framework is presented for obtaining converses for finite blocklength lossy joint source-channel coding problems. The framework applies for any loss criterion, generalizes certain previously known converses, and also extends to multi-terminal settings. The finite blocklength problem is posed equivalently as a nonconvex optimization problem and using a lift-and-project-like method, a close but tractable LP relaxation of this problem is derived. Lower bounds on the original problem are obtained by the construction of feasible points for the dual of the LP relaxation. A particular application of this approach leads to new converses, which recover and improve on the converses of Kostina and Verdú for finite blocklength lossy joint source-channel coding and lossy source coding. For finite blocklength channel coding, the LP relaxation recovers the converse of Polyanskiy, Poor and Verdú and leads to a new improvement on the converse of Wolfowitz, showing thereby that our LP relaxation is asymptotically tight with increasing blocklengths for channel coding, lossless source coding, and joint source-channel coding with the excess distortion probability as the loss criterion. Using a duality-based argument, a new converse is derived for finite blocklength joint source-channel coding for a class of source-channel pairs. Employing this converse, the LP relaxation is also shown to be tight for all blocklengths for the minimization of the expected average symbolwise Hamming distortion of a q-ary uniform source over a q-ary symmetric memoryless channel for any q ∈ N. The optimization formulation and the lift-and-project method are extended to networked settings and demonstrated by obtaining an improvement on a converse of Zhou et al. for the successive refinement problem for successively refinable source-distortion measure triplets.
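
To make the relaxation-and-duality mechanism in the summary concrete, the following is a schematic sketch of a lift-and-project LP relaxation with a weak-duality lower bound. It is an illustration under generic notation, not the paper's exact formulation: the cost matrix C, simplices Δ_m and Δ_n, lifted variable W, and dual data (b, λ) are placeholders introduced here.

\[
\begin{aligned}
\text{(nonconvex problem)}\quad & \mathrm{OPT} \;=\; \min_{x \in \Delta_m,\; y \in \Delta_n} \;\sum_{i,j} C_{ij}\, x_i y_j \\
\text{(lift)}\quad & W_{ij} \approx x_i y_j, \qquad W \ge 0,\quad \sum_{j} W_{ij} = x_i,\quad \sum_{i} W_{ij} = y_j,\quad \sum_{i,j} W_{ij} = 1 \\
\text{(LP relaxation)}\quad & \mathrm{LP} \;=\; \min_{x,\,y,\,W} \Big\{ \langle C, W \rangle \;:\; \text{the linear lifted constraints above} \Big\} \;\le\; \mathrm{OPT} \\
\text{(weak duality)}\quad & b^{\top}\lambda \;\le\; \mathrm{LP} \;\le\; \mathrm{OPT} \quad \text{for every feasible point } \lambda \text{ of the LP dual,}
\end{aligned}
\]
where b collects the right-hand sides of the LP's equality constraints. Any such dual feasible point certifies a lower bound on the original nonconvex problem; roughly speaking, in the paper the bilinearity arises from the composition of encoder and decoder kernels, and explicitly constructed dual feasible points of the (much larger) relaxation of the coding problem yield the finite blocklength converses described in the summary.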
ISSN: 0018-9448, 1557-9654
DOI: 10.1109/TIT.2017.2738634