Advances in evolutionary feature selection neural networks with co-evolution learning

Bibliographic Details
Published in: Neural Computing & Applications, 2008-06, Vol. 17 (3), pp. 217-226
Main Author: Mohamed Ben Ali, Yamina
Format: Article
Language:English
Description
Summary: Training neural networks is a complex task, given that many algorithms must be combined to find the best solution to a classification problem. In this work, we focus on evolutionary computing to minimize a neural network configuration. For this purpose, a distribution estimation framework is used to select relevant features, which leads to good classification accuracy at a lower computational cost. First, a pruning strategy based on a score function is applied to decide the relevance of each network in the genetic population. Since the complexity of the network (connections, weights, and biases) matters most, the cooling state of the system is strongly tied to entropy, which serves as the minimization function for reaching the desired solution. The framework also proposes co-evolution learning (with discrete and continuous representations) to improve the behavior of evolutionary neural learning. The results obtained from simulations show that the proposed approach is promising and could be extended to other classes of neural networks.
ISSN: 0941-0643; 1433-3058
DOI: 10.1007/s00521-007-0114-x
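
The summary above describes an estimation-of-distribution approach to feature selection for a neural classifier: a population of candidate feature subsets is sampled, each is scored by training a network and penalizing its complexity, and the sampling distribution is re-estimated from the best candidates. The following Python sketch is only a rough, non-authoritative illustration of that general idea under stated assumptions; the dataset, the complexity penalty, and the UMDA-style probability update are placeholders and do not reproduce the paper's actual score function, entropy-based cooling, or co-evolution scheme.

# Illustrative sketch only: generic estimation-of-distribution feature
# selection around a small MLP classifier. The dataset, fitness penalty,
# and update rule are assumptions, not the article's exact method.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
n_features = X.shape[1]

def fitness(mask):
    # Accuracy of a small MLP trained on the selected features,
    # minus a penalty proportional to how many features are kept.
    if mask.sum() == 0:
        return 0.0
    clf = MLPClassifier(hidden_layer_sizes=(10,), max_iter=300, random_state=0)
    clf.fit(X_tr[:, mask], y_tr)
    acc = clf.score(X_te[:, mask], y_te)
    return acc - 0.01 * mask.sum() / n_features

probs = np.full(n_features, 0.5)      # marginal inclusion probabilities
rng = np.random.default_rng(0)
for generation in range(10):
    # Sample a population of binary feature masks from the current distribution.
    population = rng.random((20, n_features)) < probs
    scores = np.array([fitness(m) for m in population])
    elite = population[np.argsort(scores)[-5:]]   # keep the best candidates
    # Re-estimate the distribution from the elite (UMDA-style update).
    probs = 0.7 * probs + 0.3 * elite.mean(axis=0)

best_mask = probs > 0.5
print("selected features:", np.flatnonzero(best_mask))

Running the loop drives the inclusion probabilities toward features that support accurate, compact classifiers, which is the general effect the abstract attributes to its distribution estimation framework.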