Assembling engineering knowledge in a modular multi-layer perceptron neural network
Main Authors:
Format: Conference Proceeding
Language: English
Summary: The popular multilayer perceptron (MLP) topology with an error-backpropagation learning rule does not allow the developer to use the explicit engineering knowledge that is available in real-life problems. Design procedures described in the literature start either with a random initialization or with a 'smart' initialization of the weight values based on statistical properties of the training data. This article presents a design methodology that enables the insertion of pre-trained parts into an MLP network topology and illustrates the advantages of such a modular approach. Furthermore, we discuss the differences between the modular approach and a hybrid approach, in which explicit knowledge is captured by mathematical models. In a hybrid design, a mathematical model is embedded in the modular neural network, either as an optimization of one of the pre-trained subnetworks or because the designer wants a certain degree of transparency of the captured knowledge in the modular design.
DOI: 10.1109/ICNN.1997.611670
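The modular approach the abstract describes, inserting a pre-trained subnetwork into a larger MLP and training only the remaining parts, can be sketched roughly as follows. This is an illustration, not the paper's implementation: the "pre-trained" hidden-layer weights here are random stand-ins, and a simple squared-error gradient step trains only the output layer while the inserted module stays frozen.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)

# Hypothetical pre-trained hidden module: in the paper's setting these
# weights would come from a separately trained subnetwork capturing
# engineering knowledge; here they are random stand-ins, and they are
# kept frozen while the rest of the MLP is trained.
W_pre = rng.normal(size=(2, 4))
b_pre = np.zeros(4)

# Conventionally (randomly) initialized, trainable output layer.
W_out = rng.normal(size=(4, 1)) * 0.1
b_out = np.zeros(1)

# Toy training data (XOR) with a squared-error loss.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0.0], [1.0], [1.0], [0.0]])

def forward(X):
    h = sigmoid(X @ W_pre + b_pre)          # frozen pre-trained module
    return h, sigmoid(h @ W_out + b_out)    # trainable head

W_pre_before = W_pre.copy()
_, p = forward(X)
loss_init = float(((p - y) ** 2).mean())

lr = 0.5
for _ in range(500):
    h, p = forward(X)
    delta = (p - y) * p * (1 - p)           # dMSE/d(output pre-activation), up to a constant
    W_out -= lr * h.T @ delta               # only the head is updated;
    b_out -= lr * delta.sum(axis=0)         # no gradient flows into W_pre

_, p = forward(X)
loss_final = float(((p - y) ** 2).mean())
frozen_unchanged = bool(np.array_equal(W_pre, W_pre_before))
```

In the modular design the frozen block would be replaced by a genuinely pre-trained subnetwork; in the hybrid variant the abstract contrasts it with, that block would instead be an explicit mathematical model embedded in the network.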