Developing Novel Activation Functions Based Deep Learning LSTM for Classification

Bibliographic Details
Published in:IEEE access 2022, Vol.10, p.97259-97275
Main Authors: Essai Ali, Mohamed H., Abdel-Rahman, Adel B., Badry, Eman A.
Format: Article
Language:English
Description
Summary: This study proposes novel Long Short-Term Memory (LSTM)-based classifiers by modifying the internal structure of LSTM neural networks, using 26 state activation functions as alternatives to the traditional hyperbolic tangent (tanh) activation function. LSTM networks perform well at mitigating the vanishing gradient problem observed in recurrent neural networks. Performance investigations were carried out using three distinct deep learning optimization algorithms to evaluate the efficiency of the proposed state-activation-function-based LSTM classifiers on two different classification tasks. The simulation results demonstrate that the proposed classifiers using Modified Elliott, Softsign, Sech, Gaussian, Bitanh1, Bitanh2, and Wave as state activation functions outperform the tanh-based LSTM classifiers in terms of classification accuracy. The proposed classifiers are recommended for use and evaluation in other classification tasks.
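The core idea described in the summary, swapping the tanh state activation inside the LSTM cell for an alternative function, can be sketched as follows. This is a minimal NumPy illustration under stated assumptions, not the paper's implementation: the activation formulas shown are the standard forms of Softsign, Sech, and Gaussian, and the paper's exact definitions of Modified Elliott, Bitanh1/2, and Wave are not given in this record, so they are omitted.

```python
import numpy as np

# Candidate state activations (standard textbook forms; the paper's
# Modified Elliott, Bitanh1/2 and Wave definitions are not reproduced here).
ACTIVATIONS = {
    "tanh":     np.tanh,
    "softsign": lambda x: x / (1.0 + np.abs(x)),
    "sech":     lambda x: 1.0 / np.cosh(x),
    "gaussian": lambda x: np.exp(-x * x),
}

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h, c, W, U, b, state_act=np.tanh):
    """One LSTM time step with a pluggable state activation.

    W: (4H, D) input weights, U: (4H, H) recurrent weights, b: (4H,).
    Gate order in the stacked weights: input, forget, candidate, output.
    """
    H = h.shape[0]
    z = W @ x + U @ h + b
    i = sigmoid(z[0:H])            # input gate
    f = sigmoid(z[H:2*H])          # forget gate
    g = state_act(z[2*H:3*H])      # candidate: this is where tanh is replaced
    o = sigmoid(z[3*H:4*H])        # output gate
    c_new = f * c + i * g
    h_new = o * state_act(c_new)   # cell-output activation replaced as well
    return h_new, c_new

# Hypothetical usage with random weights, comparing two state activations.
rng = np.random.default_rng(0)
D, H = 3, 4
W = rng.normal(size=(4 * H, D))
U = rng.normal(size=(4 * H, H))
b = np.zeros(4 * H)
x, h, c = rng.normal(size=D), np.zeros(H), np.zeros(H)

h_tanh, _ = lstm_step(x, h, c, W, U, b, state_act=ACTIVATIONS["tanh"])
h_soft, _ = lstm_step(x, h, c, W, U, b, state_act=ACTIVATIONS["softsign"])
```

The design choice mirrors the study's setup: only the state activation (applied to the candidate vector and to the cell state at the output) is varied, while the sigmoid gate activations are left unchanged.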
ISSN:2169-3536
DOI:10.1109/ACCESS.2022.3205774