Fortified Cuckoo Search Algorithm on training multi-layer perceptron for solving classification problems


Bibliographic Details
Published in:Personal and Ubiquitous Computing 2023-06, Vol.27 (3), p.1039-1049
Main Authors: Thirugnanasambandam, Kalaipriyan, Prabu, U., Saravanan, D., Anguraj, Dinesh Kumar, Raghav, R.S.
Format: Article
Language:English
Description
Summary:The multi-layer perceptron (MLP) is an artificial neural network (ANN) model that can stack several hidden layers and undergo intensive training to obtain optimal results. Classification problems, in turn, attract strong research interest aimed at improving classification accuracy. Within ANNs, the feedforward neural network (FNN) is a model suited to solving both classification and regression problems. Given input data, an FNN applies the sum-of-products rule and an activation function to map the input to its appropriate output. In the sum-of-products rule, terms called weights must be chosen appropriately to realize the mapping between input and output. In a standard FNN the weights are initialized randomly, which can slow convergence toward the optimal weight values. In this paper, an effective optimization model is proposed to optimize the weights of the MLP in an FNN for classification problems. Four different datasets were chosen, and the results are interpreted with statistical performance measures.
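The idea described in the summary — an MLP whose layers compute sums of products followed by an activation function, with the weights tuned by a metaheuristic rather than chosen at random — can be sketched with a plain cuckoo search. This is a minimal illustration using the standard cuckoo search with Lévy flights, not the paper's Fortified variant; the XOR task, network size, and all hyperparameters (`H`, `N`, `PA`, `ALPHA`, iteration count) are illustrative assumptions, not values from the article.

```python
import numpy as np
from math import gamma, sin, pi

rng = np.random.default_rng(0)

# Toy classification task: XOR (an illustrative stand-in for the paper's datasets)
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([0., 1., 1., 0.])

H = 4                       # hidden units (assumption)
DIM = 2 * H + H + H + 1     # in->hidden weights, hidden biases, hidden->out weights, out bias

def forward(w, X):
    """Sum-of-products plus activation at each layer, as in the abstract."""
    W1 = w[:2 * H].reshape(2, H)
    b1 = w[2 * H:3 * H]
    W2 = w[3 * H:4 * H]
    b2 = w[-1]
    h = np.tanh(X @ W1 + b1)                   # hidden layer
    return 1. / (1. + np.exp(-(h @ W2 + b2)))  # sigmoid output

def fitness(w):
    return np.mean((forward(w, X) - y) ** 2)   # mean squared error, lower is better

def levy_step(size, beta=1.5):
    """Mantegna's algorithm for Levy-distributed step lengths."""
    sigma = (gamma(1 + beta) * sin(pi * beta / 2) /
             (gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = rng.normal(0., sigma, size)
    v = rng.normal(0., 1., size)
    return u / np.abs(v) ** (1 / beta)

N, PA, ALPHA = 15, 0.25, 0.01   # nest count, abandon fraction, step scale (assumptions)
nests = rng.normal(0., 1., (N, DIM))
fit = np.array([fitness(n) for n in nests])
f_init = fit.min()
best_w, best_f = nests[fit.argmin()].copy(), f_init

for _ in range(300):
    top = nests[fit.argmin()]
    # Levy-flight moves, keeping a candidate only if it improves that nest
    for i in range(N):
        cand = nests[i] + ALPHA * levy_step(DIM) * (nests[i] - top)
        f = fitness(cand)
        if f < fit[i]:
            nests[i], fit[i] = cand, f
    # abandon a fraction PA of the worst nests and rebuild them at random
    worst = fit.argsort()[-int(PA * N):]
    nests[worst] = rng.normal(0., 1., (len(worst), DIM))
    fit[worst] = [fitness(n) for n in nests[worst]]
    if fit.min() < best_f:          # elitism: remember the best weights seen so far
        best_f = fit.min()
        best_w = nests[fit.argmin()].copy()

preds = (forward(best_w, X) > 0.5).astype(int)
```

The sketch contrasts with the random-initialization baseline the abstract criticizes: instead of using one random weight vector, the search keeps a population of candidate weight vectors and greedily improves them, so the best mean-squared error can only decrease over iterations.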
ISSN:1617-4909
1617-4917
DOI:10.1007/s00779-023-01716-1