Adaptive Online k-Subspaces with Cooperative Re-Initialization
| Field | Value |
|---|---|
| Main Authors | , , |
| Format | Conference Proceeding |
| Language | English |
| Subjects | |
| Online Access | Request full text |
| Summary | We propose a simple but principled cooperative re-initialization (CoRe) approach to k-subspaces, which also applies to k-means by viewing it as a particular case. CoRe optimizes an ensemble of identical k-subspace models and leverages their aggregate knowledge by greedily exchanging clusters throughout optimization. Further, we introduce an adaptive k-subspaces formulation with split low-rank regularization designed to adapt both the number of subspaces and their dimensions. Moreover, we present a highly scalable online algorithm based on stochastic gradient descent. In experiments on synthetic and real image data, we show that our proposed CoRe method significantly improves upon the standard probabilistic farthest insertion (i.e., k-means++) initialization approach, particularly when k is large. We further demonstrate the improved robustness of our proposed formulation, and the scalability and improved optimization performance of our SGD-based algorithm. |
| ISSN | 2473-9944 |
| DOI | 10.1109/ICCVW.2019.00082 |
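The summary mentions two generic ingredients that are easy to illustrate: probabilistic farthest insertion (k-means++-style) seeding and an online, SGD-based k-subspaces update. The sketch below shows only those generic ingredients under simplifying assumptions; it is not the authors' CoRe method, their adaptive split low-rank formulation, or their implementation, and all names (`seed_kmeans_pp`, `OnlineKSubspaces`) and the projected-gradient/QR update are illustrative choices.

```python
# Minimal illustrative sketch (not the paper's implementation) of
# k-means++-style seeding and an online k-subspaces update via SGD.
import numpy as np

def seed_kmeans_pp(X, k, rng):
    """Probabilistic farthest insertion (k-means++): each new seed is drawn
    with probability proportional to its squared distance from the chosen seeds."""
    n = X.shape[0]
    seeds = [X[rng.integers(n)]]
    for _ in range(k - 1):
        d2 = np.min([np.sum((X - s) ** 2, axis=1) for s in seeds], axis=0)
        seeds.append(X[rng.choice(n, p=d2 / d2.sum())])
    return np.stack(seeds)

class OnlineKSubspaces:
    """Toy online k-subspaces: assign a sample to the subspace with the smallest
    projection residual, then take a gradient step on that subspace's basis."""
    def __init__(self, k, ambient_dim, subspace_dim, lr=0.05, seed=0):
        rng = np.random.default_rng(seed)
        # One random D x d orthonormal basis per cluster.
        self.U = [np.linalg.qr(rng.standard_normal((ambient_dim, subspace_dim)))[0]
                  for _ in range(k)]
        self.lr = lr

    def residuals(self, x):
        # Squared reconstruction error of x against each subspace.
        return np.array([np.sum((x - U @ (U.T @ x)) ** 2) for U in self.U])

    def partial_fit(self, x):
        j = int(np.argmin(self.residuals(x)))      # cluster assignment
        U = self.U[j]
        w = U.T @ x                                 # coefficients in the basis
        r = x - U @ w                               # reconstruction residual
        grad = -2.0 * np.outer(r, w)                # d/dU of ||x - U w||^2 with w held fixed
        self.U[j] = np.linalg.qr(U - self.lr * grad)[0]  # step, then re-orthonormalize
        return j

# Usage: stream points drawn from two noisy 2-D subspaces in R^10.
rng = np.random.default_rng(0)
B = [np.linalg.qr(rng.standard_normal((10, 2)))[0] for _ in range(2)]
X = np.vstack([(B[i] @ rng.standard_normal((2, 200))).T for i in range(2)])
X += 0.01 * rng.standard_normal(X.shape)

seeds = seed_kmeans_pp(X, k=2, rng=rng)   # in the k-means view, these would be initial centers
model = OnlineKSubspaces(k=2, ambient_dim=10, subspace_dim=2)
for x in rng.permutation(X):
    model.partial_fit(x)
```

The QR step simply keeps each basis orthonormal after every update; how the paper actually initializes, exchanges clusters across the ensemble, or applies its split low-rank regularization is not reflected in this sketch.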