
Structural Properties of Recurrent Neural Networks

In this article we investigate the impact of the adaptive learning process of recurrent neural networks (RNN) on the structural properties of the derived graphs. A trained fully connected RNN can be converted to a graph by defining edges between pairs of nodes having significant weights. We measured structural properties of the derived graphs, such as characteristic path lengths, clustering coefficients and degree distributions. The results imply that a trained RNN has a significantly larger clustering coefficient than a random network with comparable connectivity. In addition, the degree distributions show the existence of nodes with a large degree, or hubs, typical of scale-free networks. We also show analytically and experimentally that this type of degree distribution has increased entropy.
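The abstract describes a simple measurement pipeline: threshold the trained weight matrix to obtain a graph, then compute the characteristic path length, clustering coefficient, degree distribution, and the entropy of that distribution. The sketch below illustrates one way to do this with NumPy and NetworkX; the random weight matrix, its size, and the 0.5 threshold are placeholder assumptions for the example, not values taken from the paper.

# Illustrative sketch only (not the authors' code): build a graph from an RNN
# weight matrix by keeping edges whose |weight| exceeds a threshold, then
# measure the structural properties discussed in the abstract.
import numpy as np
import networkx as nx

rng = np.random.default_rng(0)
W = rng.normal(size=(50, 50))   # stand-in for a trained recurrent weight matrix (assumed)
threshold = 0.5                 # cutoff for a "significant" weight (assumed)

# Undirected graph with an edge wherever either direction carries a significant weight
adj = (np.abs(W) > threshold) | (np.abs(W.T) > threshold)
np.fill_diagonal(adj, False)
G = nx.from_numpy_array(adj.astype(int))

# Structural properties measured in the paper
clustering = nx.average_clustering(G)
path_length = (nx.average_shortest_path_length(G)
               if nx.is_connected(G) else float("nan"))
degrees = np.array([d for _, d in G.degree()])

# Entropy of the degree distribution, H = -sum_k p(k) log p(k)
values, counts = np.unique(degrees, return_counts=True)
p = counts / counts.sum()
entropy = -np.sum(p * np.log(p))

print(f"clustering coefficient: {clustering:.3f}")
print(f"characteristic path length: {path_length:.3f}")
print(f"degree-distribution entropy: {entropy:.3f} nats")

A distribution with a heavy tail of high-degree hubs spreads probability mass over more degree values than a narrowly peaked random-graph distribution, which is the intuition behind the increased entropy reported in the article.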

Bibliographic Details
Published in: Neural Processing Letters, 2009-04, Vol. 29 (2), p. 75-88
Main Authors: Dobnikar, Andrej; Šter, Branko
Format: Article
Language: English
Publisher: Springer US (Boston)
Source: Springer Nature
DOI: 10.1007/s11063-009-9096-2
ISSN: 1370-4621
EISSN: 1573-773X
Subjects: Applied sciences; Artificial Intelligence; Clustering; Combinatorics; Combinatorics. Ordered structures; Complex Systems; Computational Intelligence; Computer Science; Computer science, control theory, systems; Computer systems and distributed systems. User interface; Connectionism. Neural networks; Exact sciences and technology; Graph theory; Graphs; Information retrieval. Graph; Mathematics; Nodes; Recurrent neural networks; Sciences and techniques of general use; Software; Theoretical computing