
Universal representation by Boltzmann machines with Regularised Axons

It is widely known that Boltzmann machines are capable of representing arbitrary probability distributions over the values of their visible neurons, given enough hidden ones. However, sampling -- and thus training -- these models can be numerically hard. Recently we proposed a regularisation of the connections of Boltzmann machines, in order to control the energy landscape of the model, paving a way for efficient sampling and training. Here we formally prove that such regularised Boltzmann machines preserve the ability to represent arbitrary distributions. This is in conjunction with controlling the number of energy local minima, thus enabling easy guided sampling and training. Furthermore, we explicitly show that regularised Boltzmann machines can store exponentially many arbitrarily correlated visible patterns with perfect retrieval, and we connect them to the Dense Associative Memory networks.


Bibliographic Details
Published in: arXiv.org, 2023-11-30
Main Authors: Grzybowski, Przemysław R, Jankiewicz, Antoni, Piñol, Eloy, Cirauqui, David, Grzybowska, Dorota H, Petrykowski, Paweł M, García-March, Miguel Ángel, Lewenstein, Maciej, Muñoz-Gil, Gorka, Pozas-Kerstjens, Alejandro
Format: Article
Language: English
Subjects: Associative memory; Axons; Regularization; Sampling; Training
description It is widely known that Boltzmann machines are capable of representing arbitrary probability distributions over the values of their visible neurons, given enough hidden ones. However, sampling -- and thus training -- these models can be numerically hard. Recently we proposed a regularisation of the connections of Boltzmann machines, in order to control the energy landscape of the model, paving a way for efficient sampling and training. Here we formally prove that such regularised Boltzmann machines preserve the ability to represent arbitrary distributions. This is in conjunction with controlling the number of energy local minima, thus enabling easy guided sampling and training. Furthermore, we explicitly show that regularised Boltzmann machines can store exponentially many arbitrarily correlated visible patterns with perfect retrieval, and we connect them to the Dense Associative Memory networks.
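The abstract describes Boltzmann machines as energy-based models whose sampling difficulty stems from the energy landscape. As a generic illustration (not the authors' regularisation scheme; all sizes, names, and parameter values here are arbitrary assumptions), the sketch below defines the energy of a small restricted Boltzmann machine and one block-Gibbs sampling step of the kind such models are typically sampled with:

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny restricted Boltzmann machine: 4 visible, 3 hidden binary units.
n_vis, n_hid = 4, 3
W = rng.normal(scale=0.1, size=(n_vis, n_hid))  # visible-hidden connections
a = np.zeros(n_vis)                             # visible biases
b = np.zeros(n_hid)                             # hidden biases

def energy(v, h):
    """Energy of a joint configuration; low energy means high probability."""
    return -(v @ W @ h) - a @ v - b @ h

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gibbs_step(v):
    """One block-Gibbs sweep: sample hidden given visible, then visible given hidden."""
    h = (rng.random(n_hid) < sigmoid(v @ W + b)).astype(float)
    v_new = (rng.random(n_vis) < sigmoid(W @ h + a)).astype(float)
    return v_new, h

# Run a short Gibbs chain from a random visible configuration.
v = rng.integers(0, 2, size=n_vis).astype(float)
for _ in range(10):
    v, h = gibbs_step(v)
```

Training adjusts W, a, and b so that low-energy configurations match the data; the abstract's claim is that regularising these connections controls the number of energy local minima, making such sampling tractable while preserving the ability to represent arbitrary distributions.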
identifier EISSN: 2331-8422
recordid cdi_proquest_journals_2881058222
source ProQuest - Publicly Available Content Database
subjects Associative memory
Axons
Regularization
Sampling
Training