
Topology and computational performance of attractor neural networks

To explore the relation between network structure and function, we studied the computational performance of Hopfield-type attractor neural nets with regular lattice, random, small-world, and scale-free topologies. The random configuration is the most efficient for storage and retrieval of patterns by the network as a whole. However, in the scale-free case retrieval errors are not distributed uniformly among the nodes. The portion of a pattern encoded by the subset of highly connected nodes is more robust and efficiently recognized than the rest of the pattern. The scale-free network thus achieves a very strong partial recognition. The implications of these findings for brain function and social dynamics are suggestive.
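The setup the abstract describes — Hopfield dynamics whose Hebbian couplings are restricted to the edges of a chosen graph — can be sketched as follows. This is a hypothetical minimal illustration, not the authors' code; the network size `n`, mean degree `k`, pattern count, and helper names are all assumptions made for the sketch, and only the random-graph topology is shown.

```python
# Hypothetical sketch: a diluted Hopfield network whose Hebbian couplings
# are masked by an adjacency matrix, so the same retrieval dynamics can
# run on lattice, random, small-world, or scale-free graphs.
import numpy as np

rng = np.random.default_rng(0)

def random_adjacency(n, k, rng):
    """Symmetric Erdos-Renyi-style adjacency matrix with mean degree ~ k."""
    upper = np.triu((rng.random((n, n)) < k / (n - 1)).astype(float), 1)
    return upper + upper.T

def hebbian_weights(patterns, adjacency):
    """Hebb-rule couplings, zeroed wherever the graph has no edge."""
    n = patterns.shape[1]
    w = (patterns.T @ patterns) / n
    np.fill_diagonal(w, 0.0)
    return w * adjacency

def recall(w, state, rng, sweeps=10):
    """Asynchronous sign-threshold updates, `sweeps` full passes."""
    s = state.copy()
    for _ in range(sweeps):
        for i in rng.permutation(len(s)):
            h = w[i] @ s
            if h != 0.0:
                s[i] = np.sign(h)
    return s

n, p, k = 200, 3, 50                       # assumed toy sizes
patterns = rng.choice([-1.0, 1.0], size=(p, n))
w = hebbian_weights(patterns, random_adjacency(n, k, rng))

# Corrupt 10% of the first pattern's bits, then let the network relax.
probe = patterns[0].copy()
flipped = rng.choice(n, size=n // 10, replace=False)
probe[flipped] *= -1.0
final = recall(w, probe, rng)
overlap = float(final @ patterns[0]) / n   # 1.0 would be perfect retrieval
print(f"overlap with stored pattern: {overlap:.2f}")
```

Swapping `random_adjacency` for a lattice, small-world, or scale-free (e.g. Barabási–Albert) adjacency matrix changes the topology without touching the dynamics, which is the kind of comparison the paper makes.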

Bibliographic Details
Published in: Physical review. E, Statistical, nonlinear, and soft matter physics, 2003-10, Vol.68 (4 Pt 2), p.047102-047102
Main Authors: McGraw, Patrick N, Menzinger, Michael
Format: Article
Language:English
description To explore the relation between network structure and function, we studied the computational performance of Hopfield-type attractor neural nets with regular lattice, random, small-world, and scale-free topologies. The random configuration is the most efficient for storage and retrieval of patterns by the network as a whole. However, in the scale-free case retrieval errors are not distributed uniformly among the nodes. The portion of a pattern encoded by the subset of highly connected nodes is more robust and efficiently recognized than the rest of the pattern. The scale-free network thus achieves a very strong partial recognition. The implications of these findings for brain function and social dynamics are suggestive.
DOI: 10.1103/physreve.68.047102
PMID: 14683083
ISSN: 1539-3755