
Improved Harris Hawks Optimization Using Elite Opposition-Based Learning and Novel Search Mechanism for Feature Selection

Bibliographic Details
Published in: IEEE Access, 2020, Vol. 8, p. 121127-121145
Main Authors: Sihwail, Rami, Omar, Khairuddin, Ariffin, Khairul Akram Zainol, Tubishat, Mohammad
Format: Article
Language: English
Description
Summary: The rapid increase in data volume and feature dimensionality has a negative influence on machine learning and many other fields, for example by decreasing classification accuracy and increasing computational cost. Feature selection plays a critical role as a preprocessing step in mitigating these issues: it eliminates features that may degrade classifier performance, such as irrelevant, redundant, and less informative features. This paper introduces an improved Harris hawks optimization (IHHO) that utilizes elite opposition-based learning and a new search mechanism. Harris hawks optimization (HHO) is a novel metaheuristic general-purpose algorithm recently introduced to solve continuous search problems. Compared to conventional HHO, the proposed IHHO can avoid trapping in local optima and has an enhanced search mechanism, relying on mutation, mutation neighborhood search, and rollback strategies to raise its search capability. Moreover, it improves population diversity and computational accuracy, and accelerates the convergence rate. To evaluate the performance of IHHO, we conducted a series of experiments on twenty benchmark datasets collected from the UCI repository and the scikit-feature project. The datasets represent different levels of feature dimensionality: low, moderate, and high. Four criteria were adopted to determine the superiority of IHHO: classification accuracy, fitness value, number of selected features, and statistical tests. Furthermore, IHHO was compared with other well-known algorithms, namely the Genetic Algorithm (GA), Grasshopper Optimization Algorithm (GOA), Particle Swarm Optimization (PSO), Ant Lion Optimizer (ALO), Whale Optimization Algorithm (WOA), Butterfly Optimization Algorithm (BOA), and Slime Mould Algorithm (SMA).
The experimental results confirmed the dominance of IHHO over the other optimization algorithms in different aspects, such as accuracy, fitness value, and feature selection.
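To illustrate the elite opposition-based learning idea the summary refers to, the sketch below shows one common EOBL formulation: each candidate is reflected through the dynamic bounds of the current elite group, and the better of the original and opposite populations is retained. This is a minimal, hypothetical example and not the authors' implementation; the sphere fitness function and all parameter values are illustrative assumptions.

```python
import numpy as np

def fitness(x):
    # Illustrative objective (sphere function); the paper instead uses a
    # wrapper fitness combining classification accuracy and feature count.
    return np.sum(x ** 2)

def elite_opposition(population, n_elite=3):
    """Apply elite opposition-based learning to a real-valued population.

    For each solution x, the opposite point with respect to the dynamic
    bounds [da, db] of the current elite group is x_opp = da + db - x.
    Originals and opposites are merged and the best half is kept.
    """
    fits = np.array([fitness(ind) for ind in population])
    elite = population[np.argsort(fits)[:n_elite]]
    da, db = elite.min(axis=0), elite.max(axis=0)  # dynamic elite bounds
    opposites = da + db - population                # opposite candidates
    merged = np.vstack([population, opposites])
    merged_fits = np.array([fitness(ind) for ind in merged])
    # Greedy selection: retain the best |population| solutions.
    return merged[np.argsort(merged_fits)[:len(population)]]

rng = np.random.default_rng(0)
pop = rng.uniform(-5.0, 5.0, size=(10, 4))
new_pop = elite_opposition(pop)
```

Because the merged pool always contains the original population, the best fitness after an EOBL step can never be worse than before it, which is why the technique helps diversity without sacrificing the incumbent best solution.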
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2020.3006473