Neural network training for complex industrial applications

Bibliographic Details
Main Authors: VanLandingham, H.F., Azam, F., Pulliam, W.
Format: Conference Proceeding
Language: English
Summary: The paper presents two methods of training multilayer perceptrons (MLPs) that use both functional values and co-located derivative values during the training process. The first method extends the standard backpropagation training algorithm for MLPs, whereas the second method employs genetic algorithms (GAs) to find the optimal neural network weights using both functional and co-located function derivative values. The GAs used to optimize the weights of a feedforward artificial neural network apply a special reordering of the genotype before recombination. The ultimate goal of this research effort is to train and design artificial neural networks (ANNs) more effectively, i.e., to obtain a network that generalizes better, learns faster, and requires fewer training data points. The initial results indicate that the methods do, in fact, provide good generalization while requiring only a relatively sparse sampling of the function and its derivative values during the training phase, as shown by the illustrative examples.
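The general idea of training on both function values and co-located derivative values can be sketched as follows. This is not the paper's algorithm: the target function (sin/cos), the network size, the learning rate, and the use of a simple central-difference gradient in place of the authors' extended backpropagation or GA are all assumptions made for illustration. The loss is the sum of a mean-squared error on the function values and a mean-squared error on the derivative values at the same (co-located) sample points.

```python
import numpy as np

rng = np.random.default_rng(0)

# Sparse co-located samples of an assumed target: f(x) = sin(x), f'(x) = cos(x)
xs = np.linspace(-2.0, 2.0, 7)
fs = np.sin(xs)
dfs = np.cos(xs)

H = 10  # hidden units (illustrative choice)
params = rng.normal(0.0, 0.5, size=3 * H + 1)  # w1, b1, w2 (H each) + b2

def unpack(p):
    return p[:H], p[H:2 * H], p[2 * H:3 * H], p[3 * H]

def net(p, x):
    # One-hidden-layer tanh MLP: y = w2 . tanh(w1*x + b1) + b2
    w1, b1, w2, b2 = unpack(p)
    h = np.tanh(np.outer(x, w1) + b1)          # shape (n, H)
    return h @ w2 + b2

def net_dx(p, x):
    # Analytic dy/dx via the chain rule: sum_j w2_j * (1 - tanh^2) * w1_j
    w1, b1, w2, b2 = unpack(p)
    h = np.tanh(np.outer(x, w1) + b1)
    return (1.0 - h**2) @ (w1 * w2)

def loss(p):
    # Combined objective: match function values AND co-located derivatives
    e_f = net(p, xs) - fs
    e_d = net_dx(p, xs) - dfs
    return np.mean(e_f**2) + np.mean(e_d**2)

def num_grad(p, eps=1e-6):
    # Central-difference gradient (a simplification; the paper extends backprop)
    g = np.zeros_like(p)
    for i in range(p.size):
        q = p.copy()
        q[i] += eps; lp = loss(q)
        q[i] -= 2 * eps; lm = loss(q)
        g[i] = (lp - lm) / (2 * eps)
    return g

loss_before = loss(params)
for _ in range(3000):                          # plain gradient descent
    params -= 0.02 * num_grad(params)
loss_after = loss(params)
```

Because the derivative term constrains the slope between sample points, a sparse sampling of (x, f, f') can pin down the fit more tightly than function values alone, which is the effect the abstract reports.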
DOI:10.1109/SMCIA.2001.936720