On Model of Recurrent Neural Network on a Time Scale: Exponential Convergence and Stability Research

Bibliographic Details
Published in: IEEE Transactions on Neural Networks and Learning Systems, 2024-03, Vol. PP, pp. 1-15
Main Authors: Martsenyuk, Vasyl, Bernas, Marcin, Klos-Witkowska, Aleksandra
Format: Article
Language:English
Subjects:
Description
Summary: The majority of the results on modeling recurrent neural networks (RNNs) are obtained using delayed differential equations, which imply continuous time representation. On the other hand, these models must be discrete in time, given their practical implementation in computer systems, requiring their versatile utilization across arbitrary time scales. Hence, the goal of this research is to model and investigate the architecture design of a delayed RNN using delayed differential equations on a time scale. Internal memory can be utilized to describe the calculation of the future states using discrete and distributed delays, which is a representation of the deep learning architecture for artificial RNNs. We focus on the qualitative behavior and stability of the system. Special attention is paid to the effect of the time-scale parameters on neural network dynamics. Here, we explore exponential stability in RNN models on a time scale that incorporates multiple discrete and distributed delays. Two approaches for constructing exponential estimates, based on the Hilger and the usual exponential functions, are considered and compared. The Lyapunov-Krasovskii (L-K) functional method is employed to study stability on a time scale in both cases. The established stability criteria, resulting in an exponential-like estimate, utilize a tuple of positive definite matrices, the decay rate, and the graininess of the time scale. The models of RNNs for the two-neuron network with four discrete and distributed delays, as well as the ring lattice delayed network of seven identical neurons, are numerically investigated. The results indicate how the time scale (graininess) and model characteristics (weights) influence the qualitative behavior, leading to a transition from stable focus to quasiperiodic limit cycles.
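The abstract compares exponential estimates built from the Hilger exponential and the usual exponential. As a minimal illustrative sketch (not the paper's implementation), on the uniform time scale T = hZ with constant graininess mu = h, the Hilger exponential reduces to e_lambda(t, t0) = (1 + h*lambda)^((t - t0)/h), which recovers the usual exponential exp(lambda*(t - t0)) as h tends to 0:

```python
import math

def hilger_exp(lam, t, t0=0.0, h=0.5):
    """Hilger exponential e_lambda(t, t0) on the uniform time scale T = h*Z
    (graininess mu = h): (1 + h*lam) ** ((t - t0) / h).
    Requires 1 + h*lam > 0 (lam positively regressive) for a real, positive value.
    The parameter names here are illustrative, not taken from the paper."""
    if 1.0 + h * lam <= 0.0:
        raise ValueError("lambda is not positively regressive for this graininess")
    return (1.0 + h * lam) ** ((t - t0) / h)

# As graininess h -> 0, the Hilger exponential converges to the usual exponential,
# which is why the two decay estimates in the abstract coincide in the continuous limit.
for h in (1.0, 0.1, 0.001):
    print(f"h={h}: Hilger={hilger_exp(-0.5, 4.0, h=h):.5f}, exp={math.exp(-0.5 * 4.0):.5f}")
```

The comparison printed by the loop shows how a coarser graininess makes the discrete-time decay estimate deviate from the continuous one, matching the abstract's point that graininess influences the qualitative behavior.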
ISSN: 2162-237X, 2162-2388
DOI: 10.1109/TNNLS.2024.3377446