Efficiently updating and tracking the dominant kernel principal components

Bibliographic Details
Published in: Neural Networks, 2007-03, Vol. 20 (2), pp. 220-229
Main Authors: Hoegaerts, L., De Lathauwer, L., Goethals, I., Suykens, J.A.K., Vandewalle, J., De Moor, B.
Format: Article
Language:English
Summary: The dominant set of eigenvectors of the symmetric kernel Gram matrix is used in many important kernel methods in machine learning (e.g., kernel principal component analysis, feature approximation, denoising, compression, and prediction). Yet for dynamic and/or large-scale data, the batch nature and computational demands of the eigendecomposition limit these methods in numerous applications. In this paper we present an efficient incremental approach for fast calculation of the dominant kernel eigenbasis, which allows us to track the kernel eigenspace dynamically. Experiments show that our updating scheme delivers a numerically stable and accurate approximation of the eigenvalues and eigenvectors at every iteration, in comparison with the batch algorithm.
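
For context, the sketch below shows the batch computation that an incremental scheme of this kind is designed to avoid repeating from scratch: forming the centered kernel Gram matrix and extracting its k dominant eigenpairs, at O(n^3) cost per recomputation. This is a minimal NumPy illustration, not the authors' updating algorithm; the RBF kernel choice and the parameter names (gamma, k) are assumptions made for the example.

```python
import numpy as np

def rbf_kernel(X, gamma=1.0):
    # Pairwise squared Euclidean distances, then the Gaussian (RBF) kernel.
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * d2)

def dominant_kernel_eigenbasis(X, k=5, gamma=1.0):
    """Batch computation of the k dominant eigenpairs of the centered Gram matrix."""
    n = X.shape[0]
    K = rbf_kernel(X, gamma)
    # Double-center the Gram matrix (standard kernel PCA centering).
    J = np.eye(n) - np.ones((n, n)) / n
    Kc = J @ K @ J
    # eigh returns eigenvalues in ascending order; keep the largest k.
    w, V = np.linalg.eigh(Kc)
    idx = np.argsort(w)[::-1][:k]
    return w[idx], V[:, idx]

# Example: 200 samples in 3 dimensions. Redoing this whole computation
# each time a sample arrives is what motivates incremental tracking.
X = np.random.default_rng(0).normal(size=(200, 3))
eigvals, eigvecs = dominant_kernel_eigenbasis(X, k=5)
```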
ISSN: 0893-6080, 1879-2782
DOI: 10.1016/j.neunet.2006.09.012