New, faster algorithms for supervised competitive learning: Counterpropagation and adaptive-resonance functionality
Published in: Neural Processing Letters, 1999-04, Vol. 9 (2), p. 107-117
Format: Article
Language: English
Summary: Hecht-Nielsen's counterpropagation networks often learn to associate input and output patterns more quickly than backpropagation networks. But simple competitive learning cannot separate closely spaced input patterns without adaptive-resonance-like (ART) functionality, which prevents neighboring patterns from 'stealing' each other's templates. We demonstrate 'pseudo-ART' functionality with a new, simple, and very fast algorithm which requires no pattern normalization at all. Competition can be based on either Euclidean or L1-norm matching. In the latter case, the new algorithm emulates fuzzy ART. We apply the pseudo-ART scheme to several new types of counterpropagation networks, including one based on competition among combined input/output patterns, and discuss application with and without noise.
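The abstract's key idea, competitive learning guarded by an ART-like vigilance test so that nearby patterns do not steal each other's templates, can be sketched in code. The following is an illustrative sketch of that general scheme, not the paper's actual algorithm: the function name, the learning rate, the vigilance threshold, and the recruit-an-uncommitted-template policy are all assumptions for demonstration. As in the abstract, inputs are not normalized and matching can use either the Euclidean (L2) or the L1 norm.

```python
import numpy as np

def pseudo_art_competitive(patterns, n_templates, vigilance=0.5,
                           lr=0.2, n_epochs=10, norm="l2", seed=0):
    """Hypothetical sketch of competitive learning with an ART-like
    vigilance test. The nearest template wins; a committed winner that
    matches too poorly (distance > vigilance) is not updated, and an
    uncommitted template is recruited instead, so neighboring patterns
    cannot 'steal' each other's templates."""
    rng = np.random.default_rng(seed)
    dim = patterns.shape[1]
    templates = rng.uniform(size=(n_templates, dim))
    committed = np.zeros(n_templates, dtype=bool)
    ord_ = 2 if norm == "l2" else 1  # Euclidean or L1 matching
    for _ in range(n_epochs):
        for x in patterns:
            # competition: nearest template wins (no input normalization)
            d = np.linalg.norm(templates - x, ord=ord_, axis=1)
            winner = int(np.argmin(d))
            if committed[winner] and d[winner] > vigilance:
                # vigilance failed: recruit an uncommitted template, if any
                free = np.flatnonzero(~committed)
                if free.size == 0:
                    continue  # no capacity left; one possible policy
                winner = int(free[0])
                templates[winner] = x.copy()
                committed[winner] = True
            else:
                # vigilance passed (or template uncommitted): move toward x
                templates[winner] += lr * (x - templates[winner])
                committed[winner] = True
    return templates, committed
```

With two well-separated input clusters, the vigilance test keeps each cluster on its own committed template instead of letting one template drift between them; the actual paper's algorithm differs in its details.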
ISSN: 1370-4621; 1573-773X
DOI: 10.1023/A:1018665006640