An Optimal Random Projection k Nearest Neighbors Ensemble via Extended Neighborhood Rule for Binary Classification
Published in: | IEEE Access, 2024, Vol. 12, pp. 61401-61409 |
---|---|
Main Authors: | , , , |
Format: | Article |
Language: | English |
Summary: | This paper presents an ensemble method for binary classification in which each base model is built on an extended neighbourhood rule (ExNRule). The ExNRule identifies the neighbours of an unseen observation in a stepwise manner: it first selects the sample point closest to the test observation, then selects the observation nearest to the previously chosen one, and repeats this search for k steps to collect the required neighbourhood. The test sample point is predicted by majority voting on the class labels of the k chosen neighbours. In the proposed method, a large number of ExNRule-based models are constructed on randomly projected bootstrap samples. The error rate of each model is computed on its out-of-bag data points, the models are ranked by these errors, and a proportion of the most accurate models is selected to form the final ensemble. The proposed method is compared with other classical procedures on 15 benchmark datasets using classification accuracy, Cohen's kappa and the Brier score (BS) as performance metrics; boxplots of the results are also given. The proposed ensemble outperforms the existing methods on almost all the benchmark datasets. For further evaluation, the proposed method is compared with other kNN-based classifiers on 3 datasets using different k values. Furthermore, its performance is also evaluated on simulated data under different scenarios. |
---|---|
ISSN: | 2169-3536 |
DOI: | 10.1109/ACCESS.2024.3392729 |
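The stepwise neighbour search that the summary describes can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function name `exnrule_predict`, the use of Euclidean distance, and the plain majority vote are assumptions, and the surrounding ensemble (random projections, bootstrap samples, out-of-bag model selection) is omitted.

```python
import numpy as np
from collections import Counter

def exnrule_predict(X_train, y_train, x_test, k=5):
    """Classify x_test with the extended neighbourhood rule (ExNRule):
    choose the training point nearest to x_test, then repeatedly choose
    the not-yet-chosen point nearest to the last choice, for k steps,
    and take a majority vote over the k collected class labels."""
    available = list(range(len(X_train)))   # indices still selectable
    anchor = x_test                         # search starts at the test point
    chosen = []
    for _ in range(k):
        # nearest remaining training point to the current anchor
        dists = [np.linalg.norm(X_train[i] - anchor) for i in available]
        idx = available.pop(int(np.argmin(dists)))
        chosen.append(idx)
        anchor = X_train[idx]               # next step searches from this neighbour
    # majority vote over the class labels of the k chosen neighbours
    votes = Counter(int(y_train[i]) for i in chosen)
    return votes.most_common(1)[0][0]
```

Unlike classical kNN, which takes the k points closest to the test observation directly, this rule walks a chain of nearest neighbours, so later choices follow the local structure of the data rather than the raw distance to the test point.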