Fuzzy neural networks stability in terms of the number of hidden layers
Format: Conference Proceeding
Language: English
Summary: This paper introduces an approach for studying the stability and generalization capability of one- and two-hidden-layer Fuzzy Flip-Flop based Neural Networks (FNNs) with various fuzzy operators. By employing fuzzy flip-flop neurons as sigmoid function generators, novel function approximators are established that also avoid overfitting when the test data contain noisy items in the form of outliers. It is shown, by comparison with existing standard tansig-function-based approaches, that networks of reduced complexity with comparable stability are obtained. Finally, examples illustrate the effect of the number of hidden layers on the networks' behavior.
DOI: 10.1109/CINTI.2011.6108523
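To make the summary's central idea concrete, the sketch below shows how a fuzzy J-K flip-flop can act as a sigmoid-like transfer function. It is illustrative only and not taken from the paper: the set-type characteristic equation, the algebraic product/sum operator pair, and the K = 1 - J feedback wiring are all assumptions about how such a neuron is commonly built.

```python
# Illustrative sketch of a fuzzy J-K flip-flop neuron (assumed construction,
# not the paper's exact formulation). A set-type fuzzy flip-flop,
#   Q+ = (J OR not-K) AND (J OR Q) AND (not-K OR Q),
# iterated with K = 1 - J, yields a quasi-sigmoidal J -> Q characteristic,
# which is what lets it stand in for a tansig-style activation.

def t_norm(a, b):
    """Fuzzy AND: algebraic product t-norm (one of many possible operators)."""
    return a * b

def s_norm(a, b):
    """Fuzzy OR: algebraic sum t-conorm, the dual of the product t-norm."""
    return a + b - a * b

def fuzzy_jk_step(j, k, q):
    """One update of the set-type fuzzy J-K flip-flop."""
    not_k = 1.0 - k
    return t_norm(t_norm(s_norm(j, not_k), s_norm(j, q)),
                  s_norm(not_k, q))

def ff_neuron(j, q0=0.5, steps=5):
    """Iterate the flip-flop with K = 1 - J from an initial state q0.
    The resulting map from j in [0, 1] to the settled Q is monotone and
    sigmoid-like: Q -> 0 as j -> 0 and Q -> 1 as j -> 1."""
    q = q0
    for _ in range(steps):
        q = fuzzy_jk_step(j, 1.0 - j, q)
    return q
```

Swapping `t_norm`/`s_norm` for other operator pairs (e.g. min/max or Łukasiewicz) changes the steepness of the resulting quasi-sigmoid, which is the "various fuzzy operators" dimension the summary refers to.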