Neural Networks with a Redundant Representation: Detecting the Undetectable
Published in: Physical Review Letters, 2020-01, Vol. 124 (2), p. 028301, Article 028301
Main Authors:
Format: Article
Language: English
Summary: We consider a three-layer Sejnowski machine and show that features learnt via contrastive divergence have a dual representation as patterns in a dense associative memory of order P=4. The latter is known to be able to store, via Hebbian-like couplings, a number of patterns scaling as N^{P-1}, where N denotes the number of constituting binary neurons interacting P-wise. We also prove that, by keeping the dense associative network far from the saturation regime (namely, allowing only for a number of patterns scaling linearly with N, while P>2), such a system is able to perform pattern recognition far below the standard signal-to-noise threshold. In particular, a network with P=4 is able to retrieve information whose intensity is O(1) even in the presence of noise of intensity O(√N) in the large-N limit. This striking skill stems from a redundant representation of patterns, which is afforded by the (relatively) low-load information storage, and it helps explain the impressive pattern-recognition abilities exhibited by new-generation neural networks. The whole theory is developed rigorously, at the replica-symmetric level of approximation, and corroborated by signal-to-noise analysis and Monte Carlo simulations.
ISSN: 0031-9007, 1079-7114
DOI: 10.1103/PhysRevLett.124.028301
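
The summary above describes a dense associative memory of order P=4 with Hebbian-like P-wise couplings, kept far below saturation so that a heavily corrupted pattern can still be retrieved. The Python sketch below is a minimal, generic illustration of that retrieval mechanism, not the authors' three-layer construction or their duality with contrastive-divergence features; the values of N, K, P, the noise level, and all function names are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes (assumptions, not taken from the paper):
N = 200   # number of binary neurons
K = 20    # stored patterns; K ~ N keeps the P > 2 network far from saturation
P = 4     # interaction order of the dense associative memory

# Random patterns xi^mu in {-1, +1}^N, stored via Hebbian-like P-wise couplings
xi = rng.choice([-1, 1], size=(K, N))

def local_field(sigma):
    """h_i = sum_mu xi^mu_i * m_mu^(P-1), with Mattis overlaps m_mu = xi^mu . sigma / N."""
    m = xi @ sigma / N
    return (m ** (P - 1)) @ xi

def retrieve(sigma, sweeps=20):
    """Zero-temperature asynchronous dynamics (a crude stand-in for Monte Carlo retrieval)."""
    sigma = sigma.copy()
    for _ in range(sweeps):
        for i in rng.permutation(N):
            sigma[i] = 1 if local_field(sigma)[i] >= 0 else -1
    return sigma

# Corrupt pattern 0 heavily (roughly 35% of spins flipped) and try to recover it
flips = rng.choice([-1, 1], size=N, p=[0.35, 0.65])
sigma0 = xi[0] * flips
print("overlap before:", xi[0] @ sigma0 / N)
print("overlap after :", xi[0] @ retrieve(sigma0) / N)
```

With P=4 and a number of patterns growing only linearly with N, the signal term m^(P-1) of the condensed pattern dominates the crosstalk from the other stored patterns, which is the low-load mechanism the summary refers to when it claims retrieval far below the standard signal-to-noise threshold.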