Dimensionality reduction via the Johnson–Lindenstrauss Lemma: theoretical and empirical bounds on embedding dimension
Published in: The Journal of Supercomputing, 2018-08, Vol. 74 (8), pp. 3933–3949
Format: Article
Language: English
Summary: The Johnson–Lindenstrauss (JL) lemma has led to the development of tools for dealing with datasets in high dimensions. The lemma asserts that a set of high-dimensional points can be projected into lower dimensions while approximately preserving the pairwise distance structure. Significant improvements of the JL lemma since its inception are summarized. Particular focus is placed on reproving Matoušek's versions of the lemma (Random Struct Algorithms 33(2):142–156, 2008), first using subgaussian projection coefficients and then using sparse projection coefficients. The results of the lemma are illustrated using simulated data. The simulations suggest a projection that is more effective, in terms of dimensionality reduction, than the theory guarantees. This more effective projection was then applied to a very large natural (rather than simulated) dataset, further strengthening the empirical evidence that an embedding dimension smaller than the proven lower bound can suffice in practice. Additionally, comparisons with other commonly used data reduction and simplification techniques are provided.
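The random projection the abstract describes can be illustrated with a minimal sketch. This is not the authors' code; it assumes a standard Gaussian projection matrix (a common subgaussian choice), simulated data, and arbitrary dimensions (n = 50 points, d = 1000 original dimensions, k = 200 embedding dimensions) chosen only for demonstration of how pairwise distances are approximately preserved:

```python
import numpy as np

rng = np.random.default_rng(0)

n, d, k = 50, 1000, 200          # points, original dim, embedding dim (illustrative values)
X = rng.standard_normal((n, d))  # simulated high-dimensional data

# Gaussian random projection: i.i.d. N(0, 1/k) entries, so that
# E[||R^T x||^2] = ||x||^2 for every fixed x.
R = rng.standard_normal((d, k)) / np.sqrt(k)
Y = X @ R                        # projected data, shape (n, k)

def pairwise_dists(A):
    """All pairwise Euclidean distances (upper triangle, as a flat array)."""
    diff = A[:, None, :] - A[None, :, :]
    D = np.sqrt((diff ** 2).sum(axis=-1))
    iu = np.triu_indices(len(A), k=1)
    return D[iu]

orig = pairwise_dists(X)
proj = pairwise_dists(Y)
ratios = proj / orig             # distortion of each pairwise distance

print(f"distance ratios in [{ratios.min():.3f}, {ratios.max():.3f}]")
```

With these parameters the printed ratios concentrate around 1, in line with the lemma's guarantee that distortion stays within 1 ± ε when k grows like log(n)/ε².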
ISSN: 0920-8542, 1573-0484
DOI: 10.1007/s11227-018-2401-y