Two-dimensional k-subspace clustering and its applications on image recognition
Published in: International Journal of Machine Learning and Cybernetics, 2023-08, Vol. 14 (8), pp. 2671-2683
Format: Article
Language: English
Summary: Image clustering plays an important role in computer vision and machine learning. However, most existing clustering algorithms flatten each image into a one-dimensional vector as the representation for subsequent learning, without fully considering the spatial relationships between pixels; this can discard useful intrinsic structural information of the matrix data samples and results in high computational complexity. In this paper, we propose a novel two-dimensional k-subspace clustering method (2DkSC). By projecting data samples into a discriminant low-dimensional space, 2DkSC maximizes the between-cluster difference while minimizing the within-cluster distance of the matrix data samples in the projected space, so that dimensionality reduction and clustering are realized simultaneously. The weight between the between-cluster and within-cluster terms is derived from a Bhattacharyya upper bound determined by the input data samples. This weighting constant makes 2DkSC adaptive, requiring no parameter tuning and improving computational efficiency. Moreover, 2DkSC can be solved efficiently as a standard eigenvalue decomposition problem. Experimental results on three different types of image datasets show that 2DkSC achieves the best clustering results in terms of average clustering accuracy and average normalized mutual information, demonstrating the superiority of the proposed method.
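The summary describes an alternating scheme: build between- and within-cluster scatter matrices from the matrix samples, obtain a projection from an eigenvalue decomposition of their weighted difference, then reassign samples to clusters in the projected space. The sketch below is an illustrative reconstruction of that general pattern only, not the authors' published algorithm: the function name, the fixed weight `t` (the paper derives it adaptively from a Bhattacharyya bound), and the simple re-assignment rule are all assumptions.

```python
import numpy as np

def two_d_k_subspace_clustering(X, k, d, t=1.0, n_iter=20, seed=0):
    """Illustrative alternating scheme for matrix-data clustering.

    X : (n, a, b) array of n matrix samples (e.g. images), kept 2-D
        rather than flattened into vectors.
    k : number of clusters; d : projected column dimension (b -> d).
    t : weight between the scatter terms (a fixed stand-in here for
        the paper's adaptively derived constant).
    """
    rng = np.random.default_rng(seed)
    n, a, b = X.shape
    labels = rng.integers(0, k, size=n)   # random initial assignment
    for _ in range(n_iter):
        # Cluster means in the original matrix space (random sample
        # as a fallback if a cluster goes empty).
        means = np.stack([X[labels == c].mean(axis=0) if np.any(labels == c)
                          else X[rng.integers(n)] for c in range(k)])
        overall = X.mean(axis=0)
        # Between-cluster and within-cluster scatter over columns.
        Sb = sum((labels == c).sum() * (means[c] - overall).T @ (means[c] - overall)
                 for c in range(k))
        Sw = sum((X[i] - means[labels[i]]).T @ (X[i] - means[labels[i]])
                 for i in range(n))
        # Projection W: top-d eigenvectors of the symmetric matrix
        # (Sb - t * Sw), i.e. maximize between- minus within-scatter.
        vals, vecs = np.linalg.eigh(Sb - t * Sw)
        W = vecs[:, np.argsort(vals)[::-1][:d]]
        # Reassign each sample to the nearest cluster mean in the
        # projected space (Frobenius norm).
        labels = np.array([np.argmin([np.linalg.norm((X[i] - means[c]) @ W)
                                      for c in range(k)]) for i in range(n)])
    return labels, W
```

Because the projection and the assignments are updated jointly, dimensionality reduction and clustering happen in one loop rather than as a separate preprocessing step, which is the structural point the summary makes.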
ISSN: 1868-8071, 1868-808X
DOI: 10.1007/s13042-023-01790-0