Divergent Projection Analysis for Unsupervised Dimensionality Reduction
Published in: Procedia Computer Science, 2022-01, Vol. 199, pp. 384-391
Main Authors:
Format: Article
Language: English
Summary: Principal component analysis (PCA) has been extended into a series of classical methods for dimensionality reduction in unsupervised learning. However, the differences among the projected data samples may not be sufficiently preserved by PCA, causing problems in both data visualization and recognition. Although some nonlinear dimensionality reduction methods, e.g., Isometric Mapping (Isomap) and Locally Linear Embedding (LLE), have been proposed to preserve the samples' differences, their nonlinear mappings hide the relationships among the data features. In this paper, we propose a linear dimensionality reduction method, called Divergent Projection Analysis (DPA), to preserve the largest differences among the samples. Our DPA projects the samples into a low-dimensional space such that any two projected samples keep a certain distance from each other, which leads to a nonconvex optimization problem. The nonconvex problem is solved by the subgradient descent algorithm and the particle swarm optimization algorithm, and their solution qualities are compared in the experiments. Experiments on a synthetic dataset and three face recognition problems confirm the effectiveness of the proposed method compared with other state-of-the-art methods.
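
The abstract gives only the outline of the optimization. As a rough illustration of that outline (not the authors' implementation), the sketch below learns a linear projection W by subgradient descent on a hinge-style penalty that pushes every pair of projected samples to be at least a chosen margin apart; the exact objective, margin, step size, and the orthonormality constraint on W are assumptions made for this sketch.

# A minimal sketch of the idea described in the abstract, assuming a
# hinge-style pairwise-distance penalty; not the paper's reference code.
import numpy as np

def dpa_sketch(X, k, margin=1.0, lr=0.01, n_iters=500, seed=0):
    """X: (n, d) data matrix; k: target dimension. Returns W of shape (d, k)."""
    n, d = X.shape
    rng = np.random.default_rng(seed)
    W = np.linalg.qr(rng.standard_normal((d, k)))[0]  # orthonormal init
    ii, jj = np.triu_indices(n, k=1)                  # all sample pairs
    for _ in range(n_iters):
        Z = X @ W                                     # projected samples
        diff = Z[ii] - Z[jj]
        dist = np.linalg.norm(diff, axis=1) + 1e-12
        viol = dist < margin                          # pairs closer than the margin
        if not viol.any():
            break
        # Subgradient of the loss sum over violating pairs of
        # max(0, margin - ||W^T x_i - W^T x_j||) with respect to W.
        G = np.zeros_like(W)
        for a, b, dv, dd in zip(ii[viol], jj[viol], diff[viol], dist[viol]):
            G -= np.outer(X[a] - X[b], dv / dd)
        W -= lr * G                                   # push violating pairs apart
        W = np.linalg.qr(W)[0]                        # re-orthonormalize W
    return W

Calling dpa_sketch(X, 2) on an (n, d) data matrix would return a d-by-2 projection; the pairwise loop is kept naive for clarity and would need vectorizing for realistic n.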
ISSN: 1877-0509
DOI: 10.1016/j.procs.2022.01.047