2-Stage instance selection algorithm for KNN based on Nearest Unlike Neighbors
Main Authors: , , ,
Format: Conference Proceeding
Language: English
Summary: Owing to virtues such as simplicity, strong generalization ability, and low training cost, the K-Nearest-Neighbor (KNN) classifier is widely used in pattern recognition and machine learning. However, the computational complexity of the KNN classifier grows when it deals with large-scale classification problems, and its efficiency therefore drops greatly. This paper proposes a general two-stage training-set condensing algorithm for the KNN classifier. First, noisy data points are identified and removed from the original training set. Second, a generalized condensed nearest neighbor rule based on the so-called Nearest Unlike Neighbor (NUN) is presented to further eliminate redundant samples from the training set. To verify the performance of the proposed method, numerical experiments are conducted on several UCI benchmark databases.
ISSN: 2160-133X
DOI: 10.1109/ICMLC.2010.5581078
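The two stages described in the abstract can be sketched roughly as follows. This is an illustrative reconstruction, not the authors' published algorithm: stage 1 is assumed to be an ENN-style filter (drop points misclassified by their own k nearest neighbors), and stage 2 uses one plausible NUN-based rule (keep a sample only if its nearest unlike neighbor is not much farther than its nearest like neighbor, i.e., it lies near the class boundary); the `ratio` threshold is an assumed parameter.

```python
import numpy as np

def knn_predict(X_train, y_train, x, k=3):
    # Majority vote among the k nearest training points (Euclidean distance).
    d = np.linalg.norm(X_train - x, axis=1)
    idx = np.argsort(d)[:k]
    vals, counts = np.unique(y_train[idx], return_counts=True)
    return vals[np.argmax(counts)]

def stage1_noise_filter(X, y, k=3):
    # Stage 1 (assumed ENN-style): remove points whose class disagrees
    # with the majority of their k nearest neighbors (excluding self).
    keep = []
    for i in range(len(X)):
        mask = np.arange(len(X)) != i
        if knn_predict(X[mask], y[mask], X[i], k) == y[i]:
            keep.append(i)
    return X[keep], y[keep]

def stage2_nun_condense(X, y, ratio=2.0):
    # Stage 2 (assumed NUN-based rule): keep only samples whose Nearest
    # Unlike Neighbor is at most `ratio` times farther than their nearest
    # like neighbor -- a proxy for "near the decision boundary".
    # Interior samples are treated as redundant and discarded.
    keep = []
    for i in range(len(X)):
        d = np.linalg.norm(X - X[i], axis=1)
        d[i] = np.inf  # exclude the point itself
        like = d[y == y[i]].min()    # nearest same-class neighbor
        unlike = d[y != y[i]].min()  # Nearest Unlike Neighbor (NUN)
        if unlike <= ratio * like:
            keep.append(i)
    return X[keep], y[keep]
```

On a toy 1-D problem, stage 1 removes an isolated mislabeled point sitting inside the opposite cluster, and stage 2 then keeps only the samples bordering the other class, which is the behavior the condensing rule targets.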