Regularized Divergences Between Covariance Operators and Gaussian Measures on Hilbert Spaces
Published in: Journal of Theoretical Probability, 2021, Vol. 34(2), pp. 580–643
Format: Article
Language: English
Summary: This work presents an infinite-dimensional generalization of the correspondence between the Kullback–Leibler and Rényi divergences between Gaussian measures on Euclidean space and the Alpha Log-Determinant divergences between symmetric, positive definite matrices. Specifically, we present the regularized Kullback–Leibler and Rényi divergences between covariance operators and Gaussian measures on an infinite-dimensional Hilbert space, which are defined using the infinite-dimensional Alpha Log-Determinant divergences between positive definite trace class operators. We show that, as the regularization parameter approaches zero, the regularized Kullback–Leibler and Rényi divergences between two equivalent Gaussian measures on a Hilbert space converge to the corresponding true divergences. The explicit formulas for the divergences involved are presented in the most general Gaussian setting.
ISSN: 0894-9840; 1572-9230
DOI: 10.1007/s10959-020-01003-2
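
The correspondence the summary builds on can be made concrete in finite dimensions. The sketch below records the standard zero-mean Gaussian divergence formulas on R^n together with the finite-dimensional Alpha Log-Determinant divergence between symmetric positive definite matrices; the identification α = 2r − 1 used to link them is an illustrative assumption and need not match the parametrization used in the article.

```latex
% Sketch: finite-dimensional, zero-mean case (assumed parametrization, not the article's own formulas).

% Kullback--Leibler divergence between N(0, Sigma_1) and N(0, Sigma_2) on R^n:
\[
\mathrm{KL}\bigl(\mathcal{N}(0,\Sigma_1)\,\|\,\mathcal{N}(0,\Sigma_2)\bigr)
 = \tfrac{1}{2}\Bigl[\operatorname{tr}\bigl(\Sigma_2^{-1}\Sigma_1\bigr) - n
   - \log\det\bigl(\Sigma_2^{-1}\Sigma_1\bigr)\Bigr].
\]

% Renyi divergence of order r in (0,1):
\[
\mathrm{D}_r\bigl(\mathcal{N}(0,\Sigma_1)\,\|\,\mathcal{N}(0,\Sigma_2)\bigr)
 = \frac{1}{2(1-r)}\,
   \log\frac{\det\bigl((1-r)\Sigma_1 + r\,\Sigma_2\bigr)}
            {\det(\Sigma_1)^{1-r}\,\det(\Sigma_2)^{r}}.
\]

% Alpha Log-Determinant divergence between SPD matrices A, B, for alpha in (-1,1):
\[
d^{\mathrm{logdet}}_{\alpha}(A,B)
 = \frac{4}{1-\alpha^{2}}\,
   \log\frac{\det\bigl(\tfrac{1-\alpha}{2}A + \tfrac{1+\alpha}{2}B\bigr)}
            {\det(A)^{\frac{1-\alpha}{2}}\,\det(B)^{\frac{1+\alpha}{2}}}.
\]

% With the assumed identification alpha = 2r - 1, A = Sigma_1, B = Sigma_2:
\[
\mathrm{D}_r\bigl(\mathcal{N}(0,\Sigma_1)\,\|\,\mathcal{N}(0,\Sigma_2)\bigr)
 = \tfrac{r}{2}\, d^{\mathrm{logdet}}_{2r-1}(\Sigma_1,\Sigma_2),
\qquad
\mathrm{KL} = \lim_{r\to 1}\mathrm{D}_r
 = \tfrac{1}{2}\, d^{\mathrm{logdet}}_{1}(\Sigma_1,\Sigma_2).
\]
```

On this reading, the regularized divergences described in the summary would correspond, loosely, to evaluating the infinite-dimensional Log-Determinant divergences at shifted covariance operators A + γI and B + γI for a regularization parameter γ > 0 and then letting γ → 0; this is a hedged gloss for orientation, not the article's precise definition.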