Extrapolation of Mackey-Glass data using Cascade Correlation
Published in: Simulation (San Diego, Calif.), 1992-05, Vol. 58 (5), p. 333-339
Main Authors:
Format: Article
Language: English
Summary: Attempting to find near-optimal architectures, ontogenic neural networks develop their own architectures as they train. As part of a project entitled "Ontogenic Neural Networks for the Prediction of Chaotic Time Series," this paper presents findings from a ten-week research period on using the Cascade Correlation ontogenic neural network to extrapolate (predict) a chaotic time series generated from the Mackey-Glass equation. During training, the neural network forms a model of the Mackey-Glass equation by observing its behavior. The neural network is then used to simulate the function in order to extrapolate it, that is, to predict its behavior beyond the region it has observed. Truer, more informative measures of extrapolation accuracy than the currently popular ones are presented. The effects of some network parameters on extrapolation accuracy were investigated. Sinusoidal activation functions turned out to be best for our data set, and the best output range for sigmoidal activation functions was [-1, +1]. Although surprisingly good extrapolations have been obtained, pitfalls remain. These pitfalls are discussed along with possible methods for avoiding them.
ISSN: 0037-5497; 1741-3133
DOI: 10.1177/003754979205800506
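
The record does not reproduce the Mackey-Glass equation or the paper's data-generation settings. As a rough, self-contained sketch of the kind of series the network is asked to extrapolate, the snippet below integrates the standard Mackey-Glass delay differential equation, dx/dt = β·x(t−τ)/(1 + x(t−τ)^n) − γ·x(t), by forward Euler. The parameter values (β = 0.2, γ = 0.1, n = 10, τ = 17) are the conventional chaotic settings, not values taken from the paper, and all function names are illustrative. The two activation functions at the end simply mirror the abstract's reported findings (sinusoidal units, and sigmoidal units rescaled to [-1, +1]); they are not the paper's code.

```python
import math


def mackey_glass(length=1000, beta=0.2, gamma=0.1, n=10, tau=17.0,
                 dt=0.1, x0=1.2):
    """Generate `length` samples (one per unit of time) of a Mackey-Glass
    series by forward-Euler integration of
        dx/dt = beta * x(t - tau) / (1 + x(t - tau)**n) - gamma * x(t).
    Parameter values are the conventional chaotic settings, assumed here.
    """
    delay = int(round(tau / dt))          # delay expressed in integration steps
    per_sample = int(round(1.0 / dt))     # integration steps per output sample
    history = [x0] * (delay + 1)          # constant initial history: x(t <= 0) = x0
    series = []
    for step in range(length * per_sample):
        x = history[-1]                   # x(t)
        x_tau = history[-(delay + 1)]     # x(t - tau)
        dx = beta * x_tau / (1.0 + x_tau ** n) - gamma * x
        history.append(x + dt * dx)       # forward-Euler update
        if (step + 1) % per_sample == 0:
            series.append(history[-1])    # keep one value per unit of time
    return series


def sigmoid_sym(x):
    """Sigmoid rescaled to the [-1, +1] output range reported as best."""
    return 2.0 / (1.0 + math.exp(-x)) - 1.0


def sine_act(x):
    """Sinusoidal activation, reported best for this data set."""
    return math.sin(x)


if __name__ == "__main__":
    data = mackey_glass(length=500)
    print(["%.4f" % v for v in data[:5]])  # first few values of the series
```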