
The effect of points dispersion on the k-nn search in random projection forests

Bibliographic Details
Published in: IEEE Access, 2022, Vol. 10, p. 1-1
Main Authors: Alshammari, Mashaan, Stavrakakis, John, Ahmed, Adel F., Takatsuka, Masahiro
Format: Article
Language: English
Description
Summary: Partitioning trees are efficient data structures for k-nearest neighbor search. Machine learning libraries commonly use a special type of partitioning tree called a kd-tree to perform k-nn search. Unfortunately, kd-trees can be ineffective in high dimensions because they need more tree levels to decrease the vector quantization (VQ) error. Random projection trees (rpTrees) solve this scalability problem by using random directions to split the data. A collection of rpTrees is called an rpForest. k-nn search in an rpForest is influenced by two factors: 1) the dispersion of points along the random direction and 2) the number of rpTrees in the rpForest. In this study, we investigate how these two factors affect k-nn search with varying k values and different datasets. We found that with a larger number of trees, the dispersion of points has a very limited effect on the k-nn search. One should use the original rpTree algorithm, picking a random direction regardless of the dispersion of points.
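
To illustrate the mechanism the summary refers to, below is a minimal Python sketch (not the authors' implementation) of an rpTree split: points are projected onto a randomly drawn unit vector and divided at the median projection, regardless of how dispersed the projections are along that direction. The names rp_split, build_rptree, and the leaf_size parameter are illustrative assumptions, not taken from the paper.

    import numpy as np

    def rp_split(points, rng):
        # draw a random direction and normalize it to a unit vector
        direction = rng.normal(size=points.shape[1])
        direction /= np.linalg.norm(direction)
        # project every point onto that direction and split at the median,
        # ignoring how spread out (dispersed) the projections are
        projections = points @ direction
        threshold = np.median(projections)
        left = projections <= threshold
        return points[left], points[~left]

    def build_rptree(points, rng, leaf_size=10):
        # recursively split until a node holds at most leaf_size points
        if len(points) <= leaf_size:
            return {"points": points}
        left, right = rp_split(points, rng)
        if len(left) == 0 or len(right) == 0:  # degenerate split: stop here
            return {"points": points}
        return {"left": build_rptree(left, rng, leaf_size),
                "right": build_rptree(right, rng, leaf_size)}

    # example usage: build a small rpForest over random data
    rng = np.random.default_rng(0)
    data = rng.normal(size=(1000, 50))
    forest = [build_rptree(data, rng) for _ in range(5)]

At query time, each tree in the forest routes the query point to a leaf and the candidate neighbors from all leaves are merged, which is why a larger number of trees can compensate for any single split's choice of direction.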
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2022.3195488