
Information space dynamics for neural networks

We propose a coupled map lattice defined on a hypercube in M dimensions, the information space, to model memory retrieval by a neural network. We consider that both neuronal activity and the spiking phase may carry information. In this model the state of the network at a given time t is completely determined by a function y(σ, t) of the bit strings σ = (σ₁, σ₂, …, σ_M), where σᵢ = ±1 for i = 1, 2, …, M, that gives the intensity with which the information σ is being expressed by the network. As an example, we consider logistic maps, coupled in the information space, to describe the evolution of the intensity function y(σ, t). We propose an interpretation of the maps in terms of the physiological state of the neurons and the coupling between them, obtain Hebb-like learning rules, show that the model works as an associative memory, numerically investigate the capacity of the network and the size of the basins of attraction, and estimate finite size effects. We finally show that the model, when exposed to sequences of uncorrelated stimuli, shows recency and latency effects that depend on the noise level, delay time of measurement, and stimulus intensity.
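The dynamics sketched in the abstract — logistic maps coupled across the vertices of an M-dimensional hypercube, each vertex a bit string σ carrying an intensity y(σ, t) — can be illustrated with a minimal simulation. This is an illustrative sketch, not the authors' implementation: the logistic parameter r, the diffusive coupling ε, and the synchronous nearest-neighbour update rule are assumed choices for demonstration only.

```python
import numpy as np

# Illustrative sketch (not the paper's implementation) of a coupled map
# lattice on an M-dimensional hypercube. Vertex k encodes the bit string
# sigma; its M Hamming-distance-1 neighbours are reached by XOR-ing k
# with 2**i. Each vertex evolves under a logistic map f, diffusively
# coupled to its neighbours. r and eps are assumed example values.

M = 4                      # dimension of the information space: 2**M vertices
r, eps = 3.7, 0.3          # logistic parameter and coupling strength (assumed)

def f(y):
    """Logistic map applied site-wise to the intensity array."""
    return r * y * (1.0 - y)

def step(y):
    """One synchronous update: local map plus average over hypercube neighbours."""
    fy = f(y)
    neigh = np.zeros_like(fy)
    for i in range(M):
        # flipping bit i of every vertex index = XOR with 2**i
        neigh += fy[np.arange(len(y)) ^ (1 << i)]
    return (1.0 - eps) * fy + (eps / M) * neigh

rng = np.random.default_rng(0)
y = rng.random(2 ** M)     # random initial intensities y(sigma, 0)
for _ in range(100):
    y = step(y)
print(y.shape, float(y.min()), float(y.max()))
```

Because the update is a convex combination of logistic-map values, intensities that start in [0, 1] remain in [0, 1]; the paper's associative-memory behaviour additionally requires the Hebb-like couplings derived there, which this sketch omits.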

Bibliographic Details
Published in: Physical review. E, Statistical, nonlinear, and soft matter physics, 2002-06, Vol. 65 (6 Pt 1), p. 061908
Main Authors: de Almeida, R M C; Idiart, M A P
Format: Article
Language: English
ISSN: 1539-3755
DOI: 10.1103/PhysRevE.65.061908
PMID: 12188760