
A Bayesian attractor network with incremental learning

Bibliographic Details
Published in: Network (Bristol) 2002-05, Vol. 13 (2), p. 179-194
Main Authors: Sandberg, A., Lansner, A., Petersson, K.M., Ekeberg
Format: Article
Language:English
Description
Summary: A real-time online learning system with capacity limits needs to gradually forget old information in order to avoid catastrophic forgetting. This can be achieved by allowing new information to overwrite old, as in a so-called palimpsest memory. This paper describes an incremental learning rule based on the Bayesian confidence propagation neural network that has palimpsest properties when employed in an attractor neural network. The network does not suffer from catastrophic forgetting, has a capacity dependent on the learning time constant, and exhibits faster convergence for newer patterns.
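The record does not reproduce the paper's learning rule itself. As an illustration of the palimpsest idea the summary describes, the sketch below keeps running probability estimates as exponential moving averages with time constant tau, so newly stored patterns gradually overwrite older ones; the function name, the eps smoothing term, and the exact update form are assumptions for illustration, not the paper's equations.

```python
import numpy as np

def make_bcpnn_stepper(n, tau, eps=1e-4):
    """Sketch of an incremental, palimpsest-style learning rule.

    Running probability estimates decay with time constant tau,
    so new patterns overwrite old ones (illustrative, assumed form).
    """
    p = np.full(n, 0.5)           # running estimates of P(x_i = 1)
    pij = np.full((n, n), 0.25)   # running estimates of P(x_i = 1, x_j = 1)

    def step(x):
        nonlocal p, pij
        a = 1.0 / tau                                 # forgetting rate
        p = (1.0 - a) * p + a * x                     # decay old, mix in new
        pij = (1.0 - a) * pij + a * np.outer(x, x)
        # Bayesian-style weights and biases; eps keeps the logs finite
        w = np.log((pij + eps) / (np.outer(p, p) + eps))
        b = np.log(p + eps)
        return w, b

    return step
```

Because each presentation multiplies old estimates by (1 - 1/tau), a pattern's trace decays geometrically: tau sets how many subsequent patterns fit before an old one is effectively overwritten, matching the capacity-versus-time-constant trade-off the summary mentions.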
ISSN: 0954-898X (print), 1361-6536 (online)
DOI: 10.1080/net.13.2.179.194