Jensen–Fisher information and Jensen–Shannon entropy measures based on complementary discrete distributions with an application to Conway’s game of life

Bibliographic Details
Published in: Physica D: Nonlinear Phenomena, Vol. 453 (November 2023), Article 133822
Main Authors: Kharazmi, Omid, Contreras-Reyes, Javier E., Balakrishnan, Narayanaswamy
Format: Article
Language:English
Description
Summary: Several information and divergence measures existing in the literature help quantify the knowledge contained in sources of information. Studying an information source from both its positive and negative aspects yields more accurate and comprehensive information. In many cases, extracting information through the positive approach may not be an easy task, while it may be feasible through the negative aspect. Negation offers a new perspective for quantifying the information or knowledge in a given system from the negative side. In this work, we study some new information measures, such as Fisher information, Fisher information distance, Jensen–Fisher information and Jensen–Shannon entropy, based on complementary distributions. We then show that the proposed Jensen–Fisher information measure can be expressed in terms of the Fisher information distance measure. We further show that the Jensen–Shannon entropy measure has two representations, in terms of the Kullback–Leibler divergence and Jensen–extropy measures. Some illustrations related to the complementary distributions of Bernoulli and Poisson random variables are then presented. Finally, for illustrative purposes, we examine a real example based on Conway's game of life and present some numerical results in terms of the proposed information measures.

• Fisher information based on complementary distributions is proposed.
• Jensen–Fisher and Jensen–Shannon entropies are also derived.
• The Jensen–Fisher entropy can be expressed in terms of the Fisher information distance measure.
• The Jensen–Shannon entropy is represented in terms of the Kullback–Leibler divergence.
• An application to Conway's game of life is presented.
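The measures summarized above build on the complementary (negation) distribution of a discrete law. This record does not reproduce the paper's exact definitions, so the sketch below uses the common Yager-style negation, in which each probability mass is replaced by its complement and renormalized, together with the textbook Jensen–Shannon entropy; the function names and the Bernoulli example are illustrative assumptions, not the authors' code.

```python
import numpy as np

def negation(p):
    """Yager-style negation (a common complementary distribution):
    p_bar_i = (1 - p_i) / (n - 1).  The paper's exact construction may differ."""
    p = np.asarray(p, dtype=float)
    n = p.size
    return (1.0 - p) / (n - 1)

def shannon_entropy(p):
    """Shannon entropy in nats, ignoring zero-mass atoms."""
    p = np.asarray(p, dtype=float)
    nz = p[p > 0]
    return -np.sum(nz * np.log(nz))

def jensen_shannon_entropy(p, q, w=0.5):
    """JS(p, q) = H(w p + (1 - w) q) - w H(p) - (1 - w) H(q)."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    m = w * p + (1.0 - w) * q
    return shannon_entropy(m) - w * shannon_entropy(p) - (1.0 - w) * shannon_entropy(q)

# Complementary distribution of a Bernoulli(theta) law on {0, 1}:
theta = 0.3
p = np.array([1.0 - theta, theta])   # P(X=0), P(X=1)
p_bar = negation(p)                  # for n = 2 this swaps the two masses
js = jensen_shannon_entropy(p, p_bar)
```

For the two-point (Bernoulli) case the negation simply swaps the two masses, so the Jensen–Shannon entropy between a Bernoulli law and its complement is symmetric in theta and vanishes at theta = 1/2, where the distribution equals its own negation.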
ISSN: 0167-2789
EISSN: 1872-8022
DOI: 10.1016/j.physd.2023.133822