
Unsupervised deep triplet hashing with pseudo triplets for scalable image retrieval

Bibliographic Details
Published in: Multimedia Tools and Applications, 2020-12, Vol. 79 (47-48), pp. 35253-35274
Main Authors: Gu, Yifan; Zhang, Haofeng; Zhang, Zheng; Ye, Qiaolin
Format: Article
Language: English
Description
Summary: Deep-learning-based hashing methods have recently proven effective for image retrieval. Most high-performance methods, however, are supervised frameworks that require human-annotated labels. Given the difficulty of labeling large-scale image datasets, unsupervised methods, which need only the images themselves for training, are better suited to practical applications. Yet improving the discriminative ability of hash codes generated by unsupervised models remains a challenging problem. In this paper, we present a novel deep framework called Unsupervised Deep Triplet Hashing (UDTH) for scalable image retrieval. UDTH builds pseudo triplets from the neighborhood structure of the high-dimensional visual feature space and then addresses two problems through the proposed objective function: 1) a triplet network is used to maximize the distance between the binary representations of different classes; 2) an autoencoder and binary quantization are used to learn hash codes that preserve the structural information of the original samples. Extensive experiments on the CIFAR-10, NUS-WIDE, and MIRFLICKR-25K datasets show that the proposed UDTH is superior to state-of-the-art methods.
ISSN: 1380-7501
EISSN: 1573-7721
DOI: 10.1007/s11042-019-7687-0
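
The abstract names three ingredients: pseudo triplets mined from the neighborhood structure of the raw feature space, a triplet loss that separates binary representations, and an autoencoder with a binary-quantization term. The following is a minimal PyTorch sketch of how such a pipeline could be wired together; every layer size, loss weight, hyperparameter, and helper name here is an illustrative assumption, not the paper's actual architecture or objective.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class HashEncoder(nn.Module):
    """Maps a pre-extracted visual feature to a relaxed code in (-1, 1)."""
    def __init__(self, feat_dim=4096, code_len=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(feat_dim, 1024), nn.ReLU(),
            nn.Linear(1024, code_len), nn.Tanh(),
        )
    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    """Reconstructs the original feature from the code (autoencoder half)."""
    def __init__(self, feat_dim=4096, code_len=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(code_len, 1024), nn.ReLU(),
            nn.Linear(1024, feat_dim),
        )
    def forward(self, b):
        return self.net(b)

def build_pseudo_triplets(features, k=5):
    """Mine (anchor, positive, negative) index triplets from the neighborhood
    structure of the feature space: a random k-nearest neighbor serves as a
    pseudo-positive, a random non-neighbor as a pseudo-negative."""
    dists = torch.cdist(features, features)            # pairwise L2 distances
    knn = dists.topk(k + 1, largest=False).indices     # column 0 is the point itself
    n = features.size(0)
    anchors = torch.arange(n)
    pos = knn[anchors, torch.randint(1, k + 1, (n,))]  # skip self at column 0
    neg = torch.randint(0, n, (n,))
    # resample any negative that landed inside the anchor's neighbor set
    bad = (neg.unsqueeze(1) == knn).any(dim=1)
    while bad.any():
        neg[bad] = torch.randint(0, n, (int(bad.sum()),))
        bad = (neg.unsqueeze(1) == knn).any(dim=1)
    return anchors, pos, neg

def udth_style_loss(codes, recon, features, a, p, n,
                    margin=2.0, lam=1.0, mu=0.1):
    """Triplet term separates pseudo-classes; reconstruction term preserves
    structure; quantization term pushes relaxed codes toward +/-1."""
    triplet = F.triplet_margin_loss(codes[a], codes[p], codes[n], margin=margin)
    recon_l = F.mse_loss(recon, features)
    quant = (codes - codes.detach().sign()).pow(2).mean()
    return triplet + lam * recon_l + mu * quant

# One illustrative training step on random stand-in features.
feats = torch.randn(256, 4096)
enc, dec = HashEncoder(), Decoder()
opt = torch.optim.Adam(list(enc.parameters()) + list(dec.parameters()), lr=1e-4)
a, p, n = build_pseudo_triplets(feats)                 # mined once per epoch here
codes = enc(feats)
loss = udth_style_loss(codes, dec(codes), feats, a, p, n)
opt.zero_grad(); loss.backward(); opt.step()
hash_codes = codes.detach().sign()                     # final binary codes
```

The tanh relaxation plus a quantization penalty is one common way to keep the non-differentiable sign() trainable; the paper's actual formulation of the binary-quantization and triplet terms may differ from this sketch.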