Bayesian Error-Based Sequences of Statistical Information Bounds

Bibliographic Details
Published in: IEEE Transactions on Information Theory, 2015-09, Vol. 61 (9), p. 5052-5062
Main Author: Prasad, Sudhaker
Format: Article
Language: English
Description
Summary: The relation between statistical information and Bayesian error is sharpened by deriving finite sequences of upper and lower bounds on equivocation entropy (EE) in terms of the minimum probability of error (MPE) and related Bayesian quantities. The well-known Fano upper bound and Feder-Merhav lower bound on EE are tightened by including a succession of posterior probabilities starting at the largest, which directly controls the MPE, and proceeding to successively lower ones. A number of other interesting results are also derived, including a sequence of upper bounds on the MPE in terms of a previously introduced sequence of generalized posterior distributions. The tightness of the various bounds is numerically evaluated for a simple example.
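Note (editorial context, not quoted from the article): the two classical results that the summary says are tightened are usually stated as follows, for an M-ary hypothesis X, an observation Y, the minimum probability of error P_e of the optimal (MAP) decision rule, and the binary entropy function H_b; the precise form of the Feder-Merhav bound given below is recalled from the standard literature and should be checked against the article itself.

\[
  H(X \mid Y) \;\le\; H_b(P_e) + P_e \log(M-1) \qquad \text{(Fano upper bound)}
\]
\[
  H(X \mid Y) \;\ge\; \phi^*(P_e) \qquad \text{(Feder--Merhav lower bound)}
\]

Here \(\phi^*\) denotes the piecewise-linear function joining the points \(\bigl(\tfrac{k-1}{k},\, \log k\bigr)\), \(k = 1, \dots, M\), so that both bounds depend on the joint distribution of X and Y only through P_e and M.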
ISSN: 0018-9448; 1557-9654
DOI: 10.1109/TIT.2015.2457913