
Relative complexity changes in time series using information measures

Bibliographic Details
Published in: Physica A, 2000, Vol. 286 (3), pp. 457–473
Main Authors: Torres, M.E., Gamero, L.G.
Format: Article
Language: English
Description
Summary: Estimation of complexity is of great interest in nonlinear signal and system analysis. Several complexity measures have been proposed: Lyapunov exponents, Lempel–Ziv complexity, and approximate entropy. In the present study, complexity measures derived from Shannon entropy and from the Havrda–Charvát–Daróczy–Tsallis (q-)entropies, together with their corresponding relative information measures, are presented and evaluated in the context of nonlinear systems exhibiting abrupt complexity changes. The performance of the proposed measures under controlled complexity changes is evaluated through numerical experiments with nonlinear models, and an example with heart rate variability signals is presented. The results show that the entropic and relative complexity measures discern complexity changes in a qualitative way similar to classical techniques, but with much lower computational cost and far less data. In the presence of noise, the relative complexity measures behave as robust tools for detecting relative complexity changes. Time-scale complexity analyses are presented using the continuous multiresolution entropies, and the assessment of time-scale complexity changes is also discussed.
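
For reference, the standard definitions of the entropies named in the summary are sketched below in LaTeX notation; the paper's exact normalizations and its q-generalized relative measures may differ in detail.

% Shannon entropy of a discrete distribution {p_i}
H_1 = -\sum_i p_i \ln p_i

% Havrda–Charvát–Daróczy–Tsallis q-entropy; recovers H_1 in the limit q \to 1
H_q = \frac{1}{q-1}\Big(1 - \sum_i p_i^{\,q}\Big), \qquad q > 0,\; q \neq 1

% Kullback–Leibler relative entropy between distributions p and r,
% the q = 1 member of the family of relative information measures
K(p \,\|\, r) = \sum_i p_i \ln \frac{p_i}{r_i}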
ISSN: 0378-4371; 1873-2119
DOI: 10.1016/S0378-4371(00)00309-5