Universal representation by Boltzmann machines with Regularised Axons

Bibliographic Details
Published in: arXiv.org 2023-11
Main Authors: Grzybowski, Przemysław R, Jankiewicz, Antoni, Piñol, Eloy, Cirauqui, David, Grzybowska, Dorota H, Petrykowski, Paweł M, García-March, Miguel Ángel, Lewenstein, Maciej, Muñoz-Gil, Gorka, Pozas-Kerstjens, Alejandro
Format: Article
Language: English
Description
Summary: It is widely known that Boltzmann machines are capable of representing arbitrary probability distributions over the values of their visible neurons, given enough hidden ones. However, sampling from -- and thus training -- these models can be numerically hard. Recently we proposed a regularisation of the connections of Boltzmann machines, in order to control the energy landscape of the model, paving the way for efficient sampling and training. Here we formally prove that such regularised Boltzmann machines preserve the ability to represent arbitrary distributions. This holds in conjunction with control over the number of local energy minima, thus enabling efficient guided sampling and training. Furthermore, we explicitly show that regularised Boltzmann machines can store exponentially many arbitrarily correlated visible patterns with perfect retrieval, and we connect them to Dense Associative Memory networks.
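The abstract's central object, a Boltzmann machine whose visible-unit marginal can match an arbitrary target distribution, can be illustrated with a small sketch. The code below brute-forces the visible marginal of a toy machine with bipartite (visible-hidden) connections; the energy convention and parameter names are standard assumptions for illustration, and the paper's axon regularisation is not modelled here.

```python
import itertools
import numpy as np

def visible_marginal(W, b, c):
    """Exact marginal p(v) of a small Boltzmann machine with bipartite
    visible-hidden connections, by summing over all hidden states.

    Assumed energy convention (not taken from the paper):
        E(v, h) = -v^T W h - b^T v - c^T h,   v, h in {0, 1}^n
    p(v) is proportional to sum_h exp(-E(v, h)).
    """
    n_visible, n_hidden = W.shape
    visibles = list(itertools.product([0, 1], repeat=n_visible))
    hiddens = [np.array(h) for h in itertools.product([0, 1], repeat=n_hidden)]
    unnorm = np.zeros(len(visibles))
    for i, v in enumerate(visibles):
        v = np.array(v)
        # Marginalise the hidden units for this visible configuration.
        unnorm[i] = sum(np.exp(v @ W @ h + b @ v + c @ h) for h in hiddens)
    return unnorm / unnorm.sum()  # normalise by the partition function

# Tiny example: 2 visible and 2 hidden units with random parameters.
rng = np.random.default_rng(0)
p = visible_marginal(rng.normal(size=(2, 2)), rng.normal(size=2), rng.normal(size=2))
```

With enough hidden units, the parameters `W`, `b`, `c` can be chosen so that this marginal approximates any target distribution over the visible states; the paper's contribution is proving this universality survives the regularisation that tames the energy landscape.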
ISSN:2331-8422