Entropy Estimators with Almost Sure Convergence and an O(n^-1) Variance

Bibliographic Details
Main Authors: Kaltchenko, A., Yang, En-hui, Timofeeva, N.
Format: Conference Proceeding
Language: English
Description
Summary: The problem of estimating the entropy rate of a stationary ergodic process μ is considered. A new nonparametric entropy rate estimator is constructed for a sample of n sequences (X_1^(1), ..., X_m^(1)), ..., (X_1^(n), ..., X_m^(n)) independently generated by μ. It is shown that, for m = O(log n), the estimator converges almost surely and its variance is upper-bounded by O(n^-1) for a large class of stationary ergodic processes with a finite state space. Since the order O(n^-1) of the variance decay in n matches that of the optimal Cramér-Rao lower bound, this is the first nearly optimal estimator in the sense of variance convergence.
DOI: 10.1109/ITW.2007.4313150
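
A minimal illustrative sketch (not part of the record): the summary describes estimating the entropy rate of a finite-alphabet stationary ergodic process from n independently generated sequences of length m = O(log n). The Python snippet below only makes that sampling setup concrete with a simple plug-in (empirical block entropy) estimate; it is not the estimator constructed in the paper, and the sample generator, the constant in m, and all function names are hypothetical.

```python
import math
import random
from collections import Counter

def plugin_entropy_rate(samples):
    """Plug-in entropy rate estimate in bits per symbol.

    `samples` is a list of n length-m sequences over a finite alphabet.
    This is NOT the paper's estimator; it is only a baseline sketch of the
    setting: the empirical entropy of the length-m blocks divided by m.
    """
    n = len(samples)
    m = len(samples[0])
    counts = Counter(tuple(s) for s in samples)
    block_entropy = -sum((c / n) * math.log2(c / n) for c in counts.values())
    return block_entropy / m

# Hypothetical usage: n sequences whose length m grows like log n (the
# m = O(log n) regime mentioned in the summary), drawn from a fair-coin
# source whose true entropy rate is 1 bit per symbol.
random.seed(0)
n = 4096
m = max(1, int(0.5 * math.log2(n)))  # m = O(log n), small constant for illustration
samples = [tuple(random.randint(0, 1) for _ in range(m)) for _ in range(n)]
print(plugin_entropy_rate(samples))  # roughly 1.0 bit/symbol for this i.i.d. source
```

Under the paper's result, a suitable estimator in this regime has variance decaying like O(n^-1), matching the Cramér-Rao rate; the plug-in baseline above is meant only to illustrate the data layout, not that guarantee.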