Method for the selection of inputs and structure of feedforward neural networks
Published in: Computers & Chemical Engineering, 2006-05, Vol. 30 (6), pp. 1038-1045
Main Authors:
Format: Article
Language: English
Subjects:
Summary: Feedforward neural networks of multi-layer perceptron type can be used as nonlinear black-box models in data-mining tasks. Common problems encountered are how to select relevant inputs from a large set of variables that potentially affect the outputs to be modeled, as well as high levels of noise in the data sets. In order to avoid over-fitting of the resulting model, the input dimension and/or the number of hidden nodes have to be restricted. This paper presents a systematic method that can guide the selection of both input variables and a sparse connectivity of the lower layer of connections in feedforward neural networks of multi-layer perceptron type with one layer of hidden nonlinear units and a single linear output node. The algorithm is illustrated on three benchmark problems. (An illustrative sketch of such a sparsely connected network follows the record below.)
ISSN: 0098-1354; 1873-4375
DOI: 10.1016/j.compchemeng.2006.01.007
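The abstract describes a multi-layer perceptron with one layer of hidden nonlinear units, a single linear output node, and a sparse set of lower-layer (input-to-hidden) connections. The sketch below illustrates only that model structure, not the paper's selection algorithm, which the record does not detail; the array names, layer sizes, tanh activation, and the particular mask pattern are assumptions chosen for the example.

```python
# Minimal sketch (not the paper's algorithm): an MLP with one hidden layer of
# nonlinear units, a single linear output node, and a binary mask enforcing
# sparse input-to-hidden connectivity. Sizes, names, and the tanh activation
# are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

n_inputs, n_hidden = 8, 4                                # assumed sizes
W1 = rng.normal(scale=0.5, size=(n_hidden, n_inputs))    # lower-layer weights
b1 = np.zeros(n_hidden)
w2 = rng.normal(scale=0.5, size=n_hidden)                # hidden-to-output weights
b2 = 0.0

# Sparse connectivity: each hidden node sees only a subset of the inputs;
# zero entries correspond to pruned (or never-selected) connections.
mask = np.zeros_like(W1)
mask[0, [0, 3]] = 1
mask[1, [1, 4, 5]] = 1
mask[2, [2]] = 1
mask[3, [0, 6]] = 1

def predict(X):
    """Forward pass: nonlinear hidden layer, single linear output node."""
    H = np.tanh(X @ (W1 * mask).T + b1)   # masked lower-layer connections
    return H @ w2 + b2

X = rng.normal(size=(5, n_inputs))        # toy batch of 5 samples
print(predict(X))                         # 5 scalar predictions
```

In a structure like this, an input whose entire column of the mask is zero is effectively excluded from the model, which is how input selection and sparse lower-layer connectivity fit together.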