Convergence analysis of Riemann‐Liouville fractional neural network
Published in: Mathematical Methods in the Applied Sciences, 2022-07, Vol. 45 (10), pp. 6378-6390
Format: Article
Language: English
Summary: Many researchers in fractional calculus believe that fractional-order calculus is well suited to information processing and to modeling certain physical systems. Training neural networks, however, suffers from long convergence times. To shorten the convergence time, this study proposes a Riemann-Liouville (R-L) fractional gradient descent method. The article first gives a theoretical proof of the convergence of the fractional-order derivative, using function approximation and an interpolation inequality theorem. Multiple simulations then show that the fractional-order neural network maintains higher accuracy than its integer-order counterpart and substantially alleviates the long-convergence-time problem: convergence time is reduced by nearly 10% compared with the integer-order network.
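The record does not spell out the proposed R-L gradient descent update. A minimal sketch of one commonly used first-order Riemann-Liouville approximation of a fractional gradient step is below; the loss function, step size, order `alpha`, lower terminal `c`, and all function names are illustrative assumptions, not the paper's actual implementation:

```python
import math

def rl_fractional_gradient(grad, w, c, alpha):
    """Approximate R-L fractional derivative of the loss w.r.t. weight w.

    Uses the first-order truncation D^alpha L(w) ~= L'(w) * (w - c)^(1 - alpha)
    / Gamma(2 - alpha), where c is the lower terminal of the R-L derivative.
    """
    return grad * abs(w - c) ** (1.0 - alpha) / math.gamma(2.0 - alpha)

def train(alpha=0.9, lr=0.1, steps=100):
    """Minimize the toy loss L(w) = (w - 3)^2 with fractional gradient descent."""
    w, c = 5.0, 0.0  # initial weight; c fixed at the R-L lower terminal
    for _ in range(steps):
        grad = 2.0 * (w - 3.0)  # ordinary gradient of L(w)
        w -= lr * rl_fractional_gradient(grad, w, c, alpha)
    return w
```

With `alpha = 1` the update reduces to ordinary gradient descent, since `(w - c)^0 / Gamma(1) = 1`; for `alpha < 1` the extra factor rescales the step, which is the mechanism the paper credits for the faster convergence.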
ISSN: 0170-4214, 1099-1476
DOI: 10.1002/mma.8175