Efficient kernel discriminative common vectors for classification

Bibliographic Details
Published in: The Visual Computer, 2015-05, Vol. 31 (5), pp. 643-655
Main Authors: Zheng, Jianwei, Huang, Qiongfang, Chen, Shengyong, Wang, Wanliang
Format: Article
Language:English
Description
Summary: Kernel discriminant analysis (KDA), which operates in the reproducing kernel Hilbert space (RKHS), is a popular approach to dimensionality reduction. Kernel discriminative common vectors (KDCV) shares the same modified Fisher linear discriminant criterion as KDA and guarantees a 100% recognition rate on the training samples as well as favorable generalization performance. However, KDCV suffers from high computational complexity in both the training and the testing stages. This paper improves the computational efficiency of KDCV through two strategies. First, Cholesky decomposition is used in place of eigen-decomposition to obtain the projection matrix. Second, matrix operations in the testing process are replaced with vector operations, reducing the computational cost. Extensive experiments on the COIL object images, ORL faces, PIE faces, and USPS handwritten digits datasets demonstrate that the proposed algorithm is more efficient than the traditional KDCV algorithm without loss of accuracy.
ISSN: 0178-2789
1432-2315
DOI: 10.1007/s00371-014-0991-9
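
Illustrative note: the summary's first strategy, deriving a projection (whitening) matrix from a symmetric positive-definite kernel matrix via Cholesky factorization rather than eigen-decomposition, can be contrasted in a few lines of NumPy/SciPy. The sketch below is an assumption-laden illustration, not the authors' KDCV implementation; the toy RBF kernel, sample sizes, and variable names are invented for this example only.

# A minimal sketch, assuming a toy RBF kernel and NumPy/SciPy; it is NOT the
# article's KDCV code, only an illustration of replacing eigen-decomposition
# with Cholesky factorization when computing a whitening/projection matrix
# from a symmetric positive-definite kernel matrix.
import numpy as np
from scipy.linalg import cholesky, solve_triangular

rng = np.random.default_rng(0)
X = rng.standard_normal((50, 10))            # 50 toy samples, 10 features

def rbf_kernel(A, B, gamma=0.1):
    # Gaussian (RBF) kernel Gram matrix.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

K = rbf_kernel(X, X) + 1e-6 * np.eye(len(X))  # jitter keeps K positive definite

# Route 1: eigen-decomposition, giving T_eig with T_eig^T K T_eig = I.
w, V = np.linalg.eigh(K)
T_eig = V / np.sqrt(w)

# Route 2: Cholesky factorization K = L L^T; with T_chol = (L^{-1})^T we also
# get T_chol^T K T_chol = I, but at a much lower cost than a full
# eigen-decomposition.
L = cholesky(K, lower=True)
T_chol = solve_triangular(L, np.eye(len(X)), lower=True).T

print(np.allclose(T_eig.T @ K @ T_eig, np.eye(len(X))))    # True
print(np.allclose(T_chol.T @ K @ T_chol, np.eye(len(X))))   # True

Both routes whiten K, but the Cholesky factorization requires substantially fewer operations than a full symmetric eigen-decomposition, which is the kind of training-stage saving the summary describes.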