Approximate Log-Determinant Divergences Between Covariance Operators and Applications
Format: Conference Proceeding
Language: English
Summary: Covariance matrices and covariance operators play increasingly important roles in numerous applications in machine learning, computer vision, and image and signal processing. An active research direction on covariance matrices and operators is the exploitation of their intrinsic non-Euclidean geometrical structures for optimal practical performance. In this work, we consider the Log-Determinant divergences, a parametrized family encompassing many divergences and distances between covariance matrices and operators, including the affine-invariant Riemannian distance and the symmetric Stein divergence. In particular, we present finite-dimensional approximations of the infinite-dimensional Log-Determinant divergences between covariance operators, which consistently estimate the exact versions and can at the same time be substantially more efficient to compute. Computationally, we focus on covariance operators in reproducing kernel Hilbert spaces. For the Hellinger distance, defined via the symmetric Stein divergence, we obtain a two-layer kernel machine that uses both the mean vector and the covariance operator. The theoretical formulation is accompanied by numerical experiments in computer vision.
ISSN: 2576-2303
DOI: 10.1109/IEEECONF44664.2019.9049042
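For context on the divergences named in the summary, the sketch below is illustrative only and is not code from the paper: it shows common finite-dimensional (matrix) definitions, under one standard convention (normalization constants differ across the literature), of the symmetric Stein divergence, a one-parameter Log-Determinant alpha-divergence, and the affine-invariant Riemannian distance between SPD covariance matrices. The infinite-dimensional operator versions and their finite-dimensional approximations developed in the paper are not reproduced here.

```python
import numpy as np
from scipy.linalg import eigh


def logdet(A):
    """Log-determinant of a symmetric positive definite (SPD) matrix via Cholesky."""
    L = np.linalg.cholesky(A)
    return 2.0 * np.sum(np.log(np.diag(L)))


def stein_divergence(A, B):
    """Symmetric Stein divergence: log det((A + B)/2) - (1/2) log det(A B)."""
    return logdet(0.5 * (A + B)) - 0.5 * (logdet(A) + logdet(B))


def logdet_alpha_divergence(A, B, alpha=0.5):
    """One-parameter Log-Determinant (alpha) divergence for alpha in (-1, 1):

        D_alpha(A, B) = 4 / (1 - alpha^2) *
            [ log det(a*A + b*B) - a*log det(A) - b*log det(B) ],
        with a = (1 - alpha)/2 and b = (1 + alpha)/2.

    At alpha = 0 this equals 4 times the symmetric Stein divergence
    (under this particular normalization convention).
    """
    a, b = (1.0 - alpha) / 2.0, (1.0 + alpha) / 2.0
    return (4.0 / (1.0 - alpha ** 2)) * (
        logdet(a * A + b * B) - a * logdet(A) - b * logdet(B)
    )


def affine_invariant_distance(A, B):
    """Affine-invariant Riemannian distance ||log(A^{-1/2} B A^{-1/2})||_F,
    computed from the generalized eigenvalues of the pencil (B, A)."""
    w = eigh(B, A, eigvals_only=True)  # eigenvalues of A^{-1} B, positive for SPD inputs
    return float(np.sqrt(np.sum(np.log(w) ** 2)))


# Small usage example on empirical covariance matrices.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 5))
Y = 1.5 * rng.standard_normal((200, 5))
A = np.cov(X, rowvar=False) + 1e-6 * np.eye(5)  # regularization keeps the matrices strictly SPD
B = np.cov(Y, rowvar=False) + 1e-6 * np.eye(5)

print("Symmetric Stein divergence  :", stein_divergence(A, B))
print("Log-Det divergence (a=0.5)  :", logdet_alpha_divergence(A, B, alpha=0.5))
print("Affine-invariant distance   :", affine_invariant_distance(A, B))
```

The Cholesky-based log-determinant and the generalized eigenvalue computation avoid forming explicit matrix inverses or square roots, which keeps the finite-dimensional computations numerically stable.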