On Computing Jeffrey's Divergence Between Time-Varying Autoregressive Models

Bibliographic Details
Published in: IEEE Signal Processing Letters, 2015-07, Vol. 22 (7), p. 915-919
Main Authors: Magnant, Clement; Giremus, Audrey; Grivel, Eric
Format: Article
Language:English
Description
Summary: Autoregressive (AR) and time-varying AR (TVAR) models are widely used in various applications, from speech processing to biomedical signal analysis. Various dissimilarity measures, such as the Itakura divergence, have been proposed to compare two AR models. However, they do not take into account the variances of the driving processes and apply only to stationary processes. More generally, the comparison between Gaussian processes is based on the Kullback-Leibler (KL) divergence, but classically only asymptotic expressions are used. In this letter, we suggest analyzing the similarities of two TVAR models, sample after sample, by recursively computing Jeffrey's divergence between the joint distributions of the successive values of each TVAR model. We then show that, under some assumptions, this divergence tends to the Itakura divergence in the stationary case.
ISSN: 1070-9908; 1558-2361
DOI: 10.1109/LSP.2014.2377473
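For context on the summary above: Jeffrey's divergence is the symmetrized Kullback-Leibler divergence, and for jointly Gaussian vectors (such as successive values of a Gaussian TVAR process) each KL term has a closed form. The sketch below computes it between two multivariate Gaussians from their means and covariances; it is a minimal illustration of the quantity being recursed on, not the authors' recursive algorithm, and the function names are ours.

```python
import numpy as np

def kl_gauss(mu0, S0, mu1, S1):
    """KL divergence KL(N(mu0, S0) || N(mu1, S1)) between multivariate Gaussians.

    Closed form: 0.5 * [tr(S1^-1 S0) + (mu1-mu0)^T S1^-1 (mu1-mu0)
                        - k + ln(det S1 / det S0)].
    """
    k = mu0.shape[0]
    S1_inv = np.linalg.inv(S1)
    diff = mu1 - mu0
    _, logdet0 = np.linalg.slogdet(S0)  # log-determinants for numerical stability
    _, logdet1 = np.linalg.slogdet(S1)
    return 0.5 * (np.trace(S1_inv @ S0)
                  + diff @ S1_inv @ diff
                  - k
                  + logdet1 - logdet0)

def jeffreys(mu0, S0, mu1, S1):
    """Jeffrey's divergence: the symmetrized KL divergence.

    Note: some references include a factor of 1/2 in front of the sum.
    """
    return kl_gauss(mu0, S0, mu1, S1) + kl_gauss(mu1, S1, mu0, S0)
```

Unlike a single KL term, `jeffreys` is symmetric in its two arguments and vanishes only when the two Gaussians coincide, which is what makes it usable as a dissimilarity measure between the joint distributions of two model outputs.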