Neural minimization methods
Published in: PLoS ONE, 2019-10, Vol. 14 (10), p. e0223476
Main Authors: , , , ,
Format: Article
Language: English
Summary: Introducing a delay enriches a model and its dynamics, allowing a more precise description of real-life phenomena. Differential equations in which the derivative at the current time depends on the solution (and possibly its derivatives) at earlier times are known as delay differential equations (DDEs). In this study, we introduce new techniques for computing numerical solutions of fractional delay differential equations (FDDEs) based on neural minimization (NM), using a Chebyshev simulated annealing neural network (ChSANN) and a Legendre simulated annealing neural network (LSANN). The main purpose of using Chebyshev and Legendre polynomials, together with simulated annealing (SA), is to reduce the mean square error (MSE), which leads to more accurate numerical approximations. The study applies ChSANN and LSANN to the solution of DDEs and FDDEs. The proposed schemes can be implemented straightforwardly in Mathematica or MATLAB to obtain explicit solutions. Computational outcomes for various numerical experiments are presented numerically and graphically, with error analysis, to demonstrate the accuracy and efficiency of the methods.
ISSN: 1932-6203
DOI: 10.1371/journal.pone.0223476
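
As an illustration of the approach described in the summary, the sketch below builds a trial solution from Chebyshev polynomials and tunes its coefficients with simulated annealing so that the mean square error of the equation residual is minimized. This is a minimal sketch only, not the authors' ChSANN/LSANN scheme: it uses Python with SciPy's `dual_annealing` rather than Mathematica or MATLAB, an integer-order test DDE y'(t) = -y(t-1) with constant history in place of a fractional one, and a trial-solution form and collocation grid that are assumptions for illustration, not taken from the paper.

```python
# Sketch: Chebyshev-polynomial trial solution + simulated annealing for a simple DDE.
# Illustrative only -- NOT the authors' ChSANN/LSANN implementation.
# Assumed test problem (not from the paper): y'(t) = -y(t - 1), y(t) = 1 for t <= 0.

import numpy as np
from numpy.polynomial import chebyshev as C
from scipy.optimize import dual_annealing

N_COEFF = 6                          # number of Chebyshev coefficients in the "network"
T_MAX = 1.0                          # solve on [0, T_MAX]
ts = np.linspace(0.0, T_MAX, 21)     # collocation (training) points
DELAY = 1.0
Y0 = 1.0                             # initial condition y(0) = 1


def trial(t, w):
    """Trial solution y(t) = Y0 + t * sum_k w_k T_k(2t/T_MAX - 1).

    The factor t enforces y(0) = Y0 exactly, so only the DDE residual
    has to be driven to zero.
    """
    x = 2.0 * t / T_MAX - 1.0        # map [0, T_MAX] onto [-1, 1]
    return Y0 + t * C.chebval(x, w)


def trial_dt(t, w, h=1e-5):
    """Derivative of the trial solution via central differences."""
    return (trial(t + h, w) - trial(t - h, w)) / (2.0 * h)


def delayed(t, w):
    """y(t - DELAY): use the constant history y = 1 when t - DELAY <= 0."""
    td = t - DELAY
    return np.where(td <= 0.0, 1.0, trial(np.maximum(td, 0.0), w))


def mse(w):
    """Mean square error of the residual y'(t) + y(t - DELAY) over the grid."""
    r = trial_dt(ts, w) + delayed(ts, w)
    return float(np.mean(r ** 2))


# Simulated annealing over the Chebyshev weights.
bounds = [(-5.0, 5.0)] * N_COEFF
result = dual_annealing(mse, bounds, seed=0, maxiter=500)

print("final MSE:   ", mse(result.x))
# On [0, 1] the exact solution of this test problem is y(t) = 1 - t, so y(1) = 0.
print("y(1) approx: ", trial(1.0, result.x))
```

Multiplying the polynomial expansion by t builds the initial condition into the trial solution, so the optimizer only has to minimize the residual MSE; in the fractional case treated in the paper, the integer-order derivative would be replaced by a fractional one (e.g. of Caputo type), which this sketch does not attempt.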