Adaptive Relevance Matrices in Learning Vector Quantization

Bibliographic Details
Published in: Neural Computation, 2009-12, Vol. 21 (12), pp. 3532-3561
Main Authors: Schneider, Petra; Biehl, Michael; Hammer, Barbara
Format: Article
Language: English
Description
Summary: We propose a new matrix learning scheme to extend relevance learning vector quantization (RLVQ), an efficient prototype-based classification algorithm, toward a general adaptive metric. By introducing a full matrix of relevance factors into the distance measure, correlations between different features and their importance for the classification scheme can be taken into account, and general metric adaptation takes place automatically during training. Compared with the weighted Euclidean metric used in RLVQ and its variants, a full matrix is more powerful at representing the internal structure of the data appropriately. Large-margin generalization bounds can be transferred to this case, leading to bounds that are independent of the input dimensionality. This also holds for local metrics attached to each prototype, which correspond to piecewise quadratic decision boundaries. The algorithm is tested against alternative learning vector quantization schemes on an artificial data set, a benchmark multiclass problem from the UCI repository, and a problem from bioinformatics, the recognition of splice sites.
ISSN: 0899-7667, 1530-888X
DOI: 10.1162/neco.2009.11-08-908
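The adaptive metric described in the summary replaces the (weighted) Euclidean distance of RLVQ with a generalized quadratic form d(x, w) = (x − w)ᵀ Λ (x − w), where parameterizing Λ = Ωᵀ Ω keeps the measure positive semidefinite. A minimal sketch of this distance and nearest-prototype classification follows; the names (`matrix_distance`, `nearest_prototype`, `omega`) are illustrative, not from the paper's implementation, and the metric-learning updates for Ω and the prototypes are omitted.

```python
import numpy as np

def matrix_distance(x, w, omega):
    """Generalized distance d(x, w) = (x - w)^T Lambda (x - w),
    with Lambda = Omega^T Omega, which is positive semidefinite by
    construction for any real matrix Omega."""
    diff = x - w
    lam = omega.T @ omega
    return float(diff @ lam @ diff)

def nearest_prototype(x, prototypes, omega):
    """Assign x the label of its closest prototype under the
    adaptive matrix metric (winner-takes-all classification)."""
    dists = [matrix_distance(x, w, omega) for w, _ in prototypes]
    return prototypes[int(np.argmin(dists))][1]

# Two labeled prototypes; with Omega = I the distance reduces to
# the squared Euclidean distance used by plain LVQ.
protos = [(np.array([0.0, 0.0]), "A"), (np.array([3.0, 3.0]), "B")]
omega = np.eye(2)
print(nearest_prototype(np.array([2.5, 2.9]), protos, omega))  # -> B
```

A non-diagonal Ω rescales and rotates the feature space, which is how correlations between features enter the classification; local versions attach a separate Ω to each prototype, yielding the piecewise quadratic decision boundaries mentioned above.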