
Expansive and competitive learning for vector quantization

Bibliographic Details
Published in: Neural Processing Letters, 2002-06, Vol. 15 (3), p. 261-273
Main Authors: Muñoz-Pérez, J., Gómez-Ruiz, J. A., López-Rubio, E., García-Bernal, M. A.
Format: Article
Language: English
Description
Summary: In this paper, we develop a necessary and sufficient condition for a local minimum to be a global minimum of the vector quantization problem, and we present a competitive learning algorithm based on this condition. The algorithm has two learning terms: the first regulates the force of attraction between the synaptic weight vectors and the input patterns in order to reach a local minimum, while the second regulates the repulsion between the synaptic weight vectors and the gravity center of the inputs in order to favor convergence to the global minimum. This algorithm leads to optimal or near-optimal solutions and allows the network to escape from local minima during training. Experimental results on image compression demonstrate that it outperforms the simple competitive learning algorithm, yielding better codebooks.
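The two-term update described in the summary can be sketched as follows. This is a minimal illustration based only on the abstract: the function name, learning-rate parameters `alpha` and `beta`, and the exact form of the update rule are assumptions for illustration, not the rule published in the paper.

```python
import numpy as np

def expansive_competitive_learning(X, n_units, alpha=0.05, beta=0.001,
                                   epochs=20, seed=0):
    """Sketch of a two-term competitive learning update for vector
    quantization: an attraction term pulls the winning weight vector
    toward the input pattern, while a repulsion term pushes it away
    from the gravity center (mean) of the inputs.  Parameter names and
    the precise update form are illustrative guesses, not the paper's
    published rule."""
    rng = np.random.default_rng(seed)
    # Initialize the codebook from randomly chosen input patterns.
    W = X[rng.choice(len(X), size=n_units, replace=False)].astype(float)
    c = X.mean(axis=0)  # gravity center of the input patterns
    for _ in range(epochs):
        for i in rng.permutation(len(X)):
            x = X[i]
            # Competition: the closest codebook vector wins.
            j = np.argmin(np.linalg.norm(W - x, axis=1))
            # Attraction toward the input plus repulsion from the
            # gravity center, intended to help the codebook escape
            # poor local minima.
            W[j] += alpha * (x - W[j]) + beta * (W[j] - c)
    return W
```

With `alpha > beta`, the expected fixed point of a winning weight vector is pulled toward the mean of the patterns it wins, only mildly biased outward by the repulsion term; the repulsion mainly spreads the codebook away from the overall data center early in training.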
ISSN:1370-4621
1573-773X
DOI:10.1023/A:1015785501885