Structural Properties of Recurrent Neural Networks
| Published in: | Neural Processing Letters, 2009-04, Vol. 29 (2), pp. 75-88 |
|---|---|
| Main Authors: | , |
| Format: | Article |
| Language: | English |
| Summary: | In this article we investigate the impact of the adaptive learning process of recurrent neural networks (RNN) on the structural properties of the derived graphs. A trained fully connected RNN can be converted to a graph by defining edges between pairs of nodes having significant weights. We measured structural properties of the derived graphs, such as characteristic path lengths, clustering coefficients, and degree distributions. The results imply that a trained RNN has a significantly larger clustering coefficient than a random network with comparable connectivity. In addition, the degree distributions show the existence of nodes with a large degree, or hubs, which are typical for scale-free networks. We also show analytically and experimentally that this type of degree distribution has increased entropy. |
| ISSN: | 1370-4621; 1573-773X |
| DOI: | 10.1007/s11063-009-9096-2 |
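For readers who want a concrete picture of the analysis the summary describes, the sketch below illustrates the general procedure under stated assumptions: the weight matrix `W` is filled with random values purely as a stand-in for a trained fully connected RNN, the 90th-percentile magnitude cutoff is an arbitrary proxy for what the paper calls "significant" weights, and an Erdős–Rényi graph is used as one possible random network with comparable connectivity. The numpy/networkx tooling is a choice of convenience, not the authors' implementation.

```python
import numpy as np
import networkx as nx

# Stand-in for a trained fully connected RNN's weight matrix (n_units x n_units).
rng = np.random.default_rng(0)
W = rng.normal(size=(100, 100))

# Keep an edge between two nodes when the weight magnitude is "significant";
# the 90th-percentile cutoff is an illustrative assumption, not the paper's threshold.
threshold = np.quantile(np.abs(W), 0.90)
A = (np.abs(W) >= threshold).astype(int)
np.fill_diagonal(A, 0)          # ignore self-loops
A = np.maximum(A, A.T)          # treat the derived graph as undirected

G = nx.from_numpy_array(A)

# Structural properties considered in the article.
clustering = nx.average_clustering(G)
degrees = np.array([d for _, d in G.degree()])

# Characteristic path length is defined on connected graphs; fall back to the
# largest connected component if thresholding fragments the graph.
giant = G.subgraph(max(nx.connected_components(G), key=len))
path_length = nx.average_shortest_path_length(giant)

# Random (Erdős–Rényi) graph with comparable connectivity for comparison.
n = G.number_of_nodes()
p = G.number_of_edges() / (n * (n - 1) / 2)
R = nx.gnp_random_graph(n, p, seed=0)
clustering_random = nx.average_clustering(R)

# Shannon entropy of the empirical degree distribution.
_, counts = np.unique(degrees, return_counts=True)
pk = counts / counts.sum()
degree_entropy = -np.sum(pk * np.log2(pk))

print(f"clustering (RNN graph):      {clustering:.3f}")
print(f"clustering (random graph):   {clustering_random:.3f}")
print(f"characteristic path length:  {path_length:.3f}")
print(f"degree distribution entropy: {degree_entropy:.3f} bits")
```

The entropy printed at the end is the Shannon entropy of the empirical degree distribution, which is the quantity whose increase the summary refers to; with a genuinely trained weight matrix in place of the random `W`, the comparison against the random graph's clustering coefficient would mirror the paper's small-world-style analysis.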