
Inducing Information Stability to Obtain Information Theoretic Necessary Requirements

Bibliographic Details
Published in: IEEE Transactions on Information Theory, 2020-02, Vol. 66 (2), pp. 835-864
Main Authors: Graves, Eric; Wong, Tan F.
Format: Article
Language: English
Description
Summary: This work presents a new methodology for obtaining information theoretic necessary conditions directly from general operational requirements. The methodology is based on the construction of a discrete random variable that, when conditioned upon, ensures information stability of quasi-images. The induced information stability allows a more direct way to develop information theoretic necessary conditions from operational requirements than using Fano's inequality. That is, while Fano's inequality uses the probability of error to establish an upper bound on the entropy of a random variable given its estimator, the proposed methodology can be applied to arbitrary operational requirements to obtain corresponding conditions on information theoretic quantities. To demonstrate its power, the new methodology is employed to derive new necessary conditions for keyed authentication over a discrete memoryless channel and to establish the capacity region of the wiretap channel, subject to finite leakage and finite error, under two different secrecy metrics. These examples establish the usefulness of the proposed methodology.
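
For reference, the Fano's inequality mentioned in the abstract is the standard bound relating the conditional entropy of a message to its probability of estimation error; the LaTeX sketch below uses the usual textbook notation (W, \hat{W}, P_e, \mathcal{W}), which is assumed here rather than taken from the paper itself.

% Standard statement of Fano's inequality (textbook form; notation assumed, not from the paper):
% W is a random variable on a finite alphabet \mathcal{W}, \hat{W} is an estimator of W,
% and P_e = \Pr\{\hat{W} \neq W\} is the probability of error.
\[
  H(W \mid \hat{W}) \;\le\; H_b(P_e) + P_e \log\bigl(\lvert \mathcal{W} \rvert - 1\bigr),
\]
% where H_b(\cdot) denotes the binary entropy function.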
ISSN: 0018-9448
EISSN: 1557-9654
DOI: 10.1109/TIT.2019.2942483