On the Basin of Attraction and Capacity of Restricted Hopfield Network as an Auto-Associative Memory
Main Authors:
Format: Conference Proceeding
Language: English
Subjects:
Online Access: Request full text
Summary: This paper introduces eigenvalue interlacing theory and the concept of the condition number to analyze the structural characteristics and attraction basin radius of both the Hopfield Neural Network (HNN) and the Restricted Hopfield Network (RHN). Both networks can be viewed as higher-dimensional dynamical systems that store patterns as fixed points. By studying two sets of AUTS cases, involving 35 and 63 nodes respectively, memorized in both the HNN and the RHN, we introduce the concept of the "effective condition number" as an indicator of a model's capacity. When the "effective condition number" surpasses a threshold, the HNN model becomes incapable of memorizing new patterns and forfeits all previously stored information when new patterns are added. In contrast, the RHN, when trained using Back-propagation Through Time (BPTT) or the Subspace Rotation Algorithm (SRA) with appropriate weight initialization, consistently maintains an "effective condition number" close to one; thus its capacity can increase with the growing complexity of the model. To facilitate meaningful comparisons, the term "radius" is defined to objectively assess the performance of both HNN and RHN models. Experimental results demonstrate that the RHN model generally outperforms the HNN model in terms of both radius and the uniformity of attraction basins. Furthermore, this paper briefly discusses how the capacity of the RHN tends to increase with a growing number of hidden nodes. Remarkably, within the capacity range of the RHN model, well-trained models with fewer hidden nodes exhibit larger attraction basin radii.
ISSN: 2833-8898
DOI: 10.1109/CyberC58899.2023.00033
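The summary above refers to Hopfield networks storing patterns as fixed points and to the condition number of the weight matrix as a capacity indicator. As a rough illustration only, the Python sketch below builds a classical Hopfield weight matrix with the Hebbian outer-product rule, reports its condition number via `np.linalg.cond`, and probes a basin of attraction by flipping a few bits of a stored pattern. It is not the paper's RHN, its BPTT/SRA training, or its exact definitions of "effective condition number" and "radius"; all names and parameters here are assumptions for demonstration.

```python
import numpy as np

# Hedged sketch: classical Hopfield storage and recall, used only to
# illustrate the quantities named in the abstract. The paper's RHN and
# its "effective condition number" may be defined differently.

def hebbian_weights(patterns):
    """Hebbian outer-product weight matrix from bipolar (+1/-1) patterns."""
    n = patterns.shape[1]
    W = patterns.T @ patterns / n   # sum of outer products, scaled by n
    np.fill_diagonal(W, 0.0)        # no self-connections
    return W

def recall(W, state, steps=20):
    """Synchronous recall: iterate sign(W x) until a fixed point."""
    for _ in range(steps):
        new = np.sign(W @ state)
        new[new == 0] = 1           # break ties toward +1
        if np.array_equal(new, state):
            break
        state = new
    return state

rng = np.random.default_rng(0)
patterns = rng.choice([-1.0, 1.0], size=(5, 35))  # 5 patterns, 35 nodes

W = hebbian_weights(patterns)

# Condition number of W: ratio of largest to smallest singular value.
# The paper tracks an analogous quantity as a capacity indicator.
print("cond(W) =", np.linalg.cond(W))

# Probe the basin of attraction: corrupt 3 bits of a stored pattern and
# check whether synchronous recall recovers the original fixed point.
probe = patterns[0].copy()
probe[:3] *= -1
print("recovered:", np.array_equal(recall(W, probe), patterns[0]))
```

In this toy setting, the number of bit flips a pattern can absorb while still being recovered plays the role of the basin "radius"; the paper defines that term precisely so that HNN and RHN models can be compared on equal footing.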