
Dimensionality Reduction by Minimal Distance Maximization


Bibliographic Details
Main Authors: Bo Xu, Kaizhu Huang, Cheng-Lin Liu
Format: Conference Proceeding
Language: English
Description
Summary: In this paper, we propose a novel discriminant analysis method, called Minimal Distance Maximization (MDM). In contrast to the traditional LDA, which actually maximizes the average divergence among classes, MDM attempts to find a low-dimensional subspace that maximizes the minimal (worst-case) divergence among classes. This "minimal" setting solves the problem caused by the "average" setting of LDA, which tends to merge similar classes with smaller divergence when used for multi-class data. Furthermore, we elegantly formulate the worst-case problem as a convex problem, making the algorithm solvable for larger data sets. Experimental results demonstrate the advantages of our proposed method against five other competitive approaches on one synthetic and six real-life data sets.
ISSN: 1051-4651, 2831-7475
DOI: 10.1109/ICPR.2010.144
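
The summary contrasts LDA's use of the average between-class divergence with MDM's worst-case (minimal) divergence. The Python sketch below only illustrates that contrast and is not the paper's method: it scores candidate projections by the average versus the minimum squared distance between projected class means on toy data. The helper names (pairwise_projected_divergences, average_criterion, worst_case_criterion), the toy data, and the choice of squared distance between class means as the divergence are assumptions for illustration; the paper's actual divergence measure and its convex optimization are defined in the full text.

```python
import numpy as np

def pairwise_projected_divergences(X, y, W):
    """Squared distances between every pair of class means after projection by W.

    A simplified stand-in for the between-class divergence discussed in the
    abstract; the paper's exact divergence measure may differ.
    """
    classes = np.unique(y)
    means = np.array([X[y == c].mean(axis=0) for c in classes])
    Z = means @ W  # project the class means into the candidate subspace
    dists = []
    for i in range(len(classes)):
        for j in range(i + 1, len(classes)):
            dists.append(np.sum((Z[i] - Z[j]) ** 2))
    return np.array(dists)

def average_criterion(X, y, W):
    """LDA-like score: average pairwise divergence between projected classes."""
    return pairwise_projected_divergences(X, y, W).mean()

def worst_case_criterion(X, y, W):
    """MDM-like score: minimal (worst-case) pairwise divergence."""
    return pairwise_projected_divergences(X, y, W).min()

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Toy data: classes 0 and 1 are close to each other; class 2 is far away.
    class_means = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 8.0]])
    X = np.vstack([rng.normal(m, 0.3, size=(50, 2)) for m in class_means])
    y = np.repeat([0, 1, 2], 50)

    # Two candidate 1-D projections (unit-norm columns).
    W_axis = np.array([[0.0], [1.0]])                  # highest average score
    W_mixed = np.array([[1.0], [1.0]]) / np.sqrt(2.0)  # keeps the similar classes apart

    for name, W in [("W_axis", W_axis), ("W_mixed", W_mixed)]:
        print(f"{name}: average={average_criterion(X, y, W):.2f}, "
              f"worst-case={worst_case_criterion(X, y, W):.2f}")
```

On this toy data the axis-aligned projection wins under the average score but collapses the two similar classes (worst-case score near zero), whereas the worst-case score favors the projection that keeps every pair of classes separated, which is the behavior the abstract attributes to MDM relative to LDA.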