An Optimization Methodology for Neural Network Weights and Architectures

Bibliographic Details
Published in: IEEE Transactions on Neural Networks and Learning Systems, 2006-11, Vol. 17 (6), p. 1452-1459
Main Authors: Ludermir, T.B., Yamazaki, A., Zanchettin, C.
Format: Article
Language:English
Description
Summary: This paper introduces a methodology for the global optimization of neural networks. The aim is the simultaneous optimization of multilayer perceptron (MLP) network weights and architectures, in order to generate topologies with few connections and high classification performance for any data set. The approach combines the advantages of simulated annealing, tabu search, and the backpropagation training algorithm to create an automatic process for producing networks with high classification performance and low complexity. Experimental results obtained on four classification problems and one prediction problem are better than those obtained by the most commonly used optimization techniques.
ISSN: 1045-9227, 2162-237X, 1941-0093, 2162-2388
DOI:10.1109/TNN.2006.881047
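The core idea described in the summary, a stochastic search over network topologies that trades classification error against connection count, can be sketched as below. This is an illustrative simulated-annealing sketch only, not the authors' algorithm: the paper's actual cost function, tabu list, and backpropagation fine-tuning step are not reproduced, and the names (`anneal`, `cost`, the `alpha` weighting) are assumptions made for the example.

```python
import math
import random

def cost(error, n_connections, total, alpha=0.5):
    # Weighted cost balancing classification error against network
    # complexity (connection ratio). The weighting is an assumption;
    # the paper's exact objective is not reproduced here.
    return alpha * error + (1 - alpha) * (n_connections / total)

def anneal(evaluate, init_mask, steps=200, t0=1.0, cooling=0.95, seed=0):
    """Simulated-annealing search over a binary connection mask.

    `evaluate(mask)` must return a classification error in [0, 1] for
    the network restricted to the connections enabled in `mask`.
    Returns the best mask found and its cost.
    """
    rng = random.Random(seed)
    mask = list(init_mask)
    total = len(mask)
    cur = cost(evaluate(mask), sum(mask), total)
    best, best_mask = cur, list(mask)
    t = t0
    for _ in range(steps):
        # Neighbour move: toggle one connection on/off.
        i = rng.randrange(total)
        mask[i] ^= 1
        cand = cost(evaluate(mask), sum(mask), total)
        # Accept improvements always; accept worse moves with a
        # temperature-dependent probability (Metropolis criterion).
        if cand < cur or rng.random() < math.exp((cur - cand) / max(t, 1e-9)):
            cur = cand
            if cand < best:
                best, best_mask = cand, list(mask)
        else:
            mask[i] ^= 1  # reject: undo the toggle
        t *= cooling  # geometric cooling schedule
    return best_mask, best
```

In the full methodology, tabu search would additionally forbid recently visited masks, and backpropagation would fine-tune the weights of each candidate topology before `evaluate` is called; both steps are omitted from this sketch.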