Boosting manta rays foraging optimizer by trigonometry operators: a case study on medical dataset
Published in: Neural Computing & Applications, 2024-06, Vol. 36 (16), pp. 9405–9436
Main Authors:
Format: Article
Language: English
Summary: The selection of attributes has become a crucial research focus in pattern recognition, machine learning, and big data analysis. In essence, the contemporary challenge is to reduce dimensionality while maintaining both a quick response time and improved classification performance. Metaheuristic algorithms (MAs) have emerged as pivotal tools for addressing this issue. The attribute-selection problem was first approached with the manta ray foraging optimization (MRFO) algorithm, but most MAs suffer from convergence toward local minima. To mitigate this challenge, an enhanced variant of MRFO, known as MRFOSCA, employs trigonometric operators inspired by the sine cosine algorithm (SCA) to tackle the feature selection problem. The k-nearest neighbor (k-NN) technique is employed to evaluate the selected feature subsets. Additionally, the statistical significance of the proposed algorithms is assessed using the nonparametric Wilcoxon rank-sum test at a 5% significance level. The outcomes are compared against well-known MAs, including the original MRFO and SCA, as well as the Harris hawks optimizer, dragonfly algorithm, grasshopper optimization algorithm, whale optimization algorithm, salp swarm algorithm, and grey wolf optimizer. The experimental and comparative analyses confirm the effective performance of the proposed methods on low- and high-dimensional datasets, achieving the highest accuracy on 85% of the feature selection benchmarks.
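The trigonometric operators that MRFOSCA borrows from SCA follow the standard sine–cosine position update. The sketch below is a generic illustration of that update and of binarizing a continuous position into a feature mask; the parameter names (r1–r4, a = 2) and the sigmoid transfer function follow common SCA descriptions in the literature and are assumptions, not the paper's exact formulation.

```python
import math
import random

def sca_update(position, best, t, t_max, a=2.0):
    """Move one candidate toward the best-so-far solution using the
    sine/cosine operators of SCA, one dimension at a time."""
    r1 = a - t * (a / t_max)  # exploration amplitude shrinks over iterations
    new_pos = []
    for x_j, p_j in zip(position, best):
        r2 = 2 * math.pi * random.random()   # phase of the oscillation
        r3 = 2 * random.random()             # weight on the best solution
        r4 = random.random()                 # switch between sine and cosine
        if r4 < 0.5:
            x_j = x_j + r1 * math.sin(r2) * abs(r3 * p_j - x_j)
        else:
            x_j = x_j + r1 * math.cos(r2) * abs(r3 * p_j - x_j)
        new_pos.append(x_j)
    return new_pos

def to_feature_mask(position, threshold=0.5):
    """Binarize a continuous position into a 0/1 feature-selection mask
    via a sigmoid transfer function."""
    return [1 if 1 / (1 + math.exp(-x)) > threshold else 0 for x in position]
```

In a full optimizer this update would be applied inside the MRFO loop in place of (or alongside) the chain- and somersault-foraging moves, which is the hybridization the abstract describes at a high level.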
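Wrapper-based feature selection with k-NN, as described in the summary, typically scores a candidate feature mask by combining classification error with a penalty on subset size. A minimal standard-library sketch, assuming leave-one-out evaluation and a weight alpha = 0.99 (common defaults in the feature-selection literature, not confirmed by this abstract):

```python
from collections import Counter

def knn_predict(train_X, train_y, x, k=3):
    """Classify x by majority vote among its k nearest training points."""
    order = sorted(range(len(train_X)),
                   key=lambda i: sum((a - b) ** 2 for a, b in zip(train_X[i], x)))
    votes = Counter(train_y[i] for i in order[:k])
    return votes.most_common(1)[0][0]

def fitness(mask, X, y, alpha=0.99, k=3):
    """Wrapper fitness: alpha * k-NN error (leave-one-out) +
    (1 - alpha) * fraction of selected features. Lower is better."""
    idx = [j for j, m in enumerate(mask) if m]
    if not idx:
        return 1.0  # selecting no features is the worst possible score
    Xs = [[row[j] for j in idx] for row in X]
    errors = sum(
        knn_predict(Xs[:i] + Xs[i + 1:], y[:i] + y[i + 1:], Xs[i], k) != y[i]
        for i in range(len(Xs)))
    return alpha * (errors / len(Xs)) + (1 - alpha) * (len(idx) / len(mask))
```

The optimizer would minimize this fitness over binary masks, trading a small accuracy loss against a smaller feature subset, which matches the dimensionality-reduction goal stated in the summary.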
ISSN: 0941-0643, 1433-3058
DOI: 10.1007/s00521-024-09565-6