
Expansion window local alignment weighted network for fine-grained sketch-based image retrieval


Bibliographic Details
Published in: Pattern Recognition 2023-12, Vol. 144, p. 109892, Article 109892
Main Authors: Zhang, Zi-Chao, Xie, Zhen-Yu, Chen, Zhen-Duo, Zhan, Yu-Wei, Luo, Xin, Xu, Xin-Shun
Format: Article
Language: English
Description
Summary: Fine-Grained Sketch-Based Image Retrieval (FG-SBIR), which is useful in many scenarios such as recommendation systems, has received a great deal of attention. In this study, we analyze the challenges faced in FG-SBIR and propose a novel Expansion Window Local Alignment Weighted Network (EWLAW-Net). Specifically, it contains two main components: the Expansion Window Local Alignment (EWLA) module and the Local Weighted Fusion (LWF) module. The EWLA module adopts an expansion window mechanism to align local features extracted from the backbone, so that features with the same semantic meaning correspond between photos and sketches. The LWF module assigns a weight to each local feature of the sketch according to its estimated importance and fuses the weighted features to compute the sketch-photo similarity used for retrieval. Experiments on five datasets demonstrate the effectiveness of the proposed method.
• Expansion Window Local Alignment Weighted Network (EWLAW-Net) is proposed for FG-SBIR.
• The Expansion Window Local Alignment module solves the spatial misalignment problem.
• The Local Weighted Fusion module evaluates the importance of local features.
• EWLAW-Net's superiority is demonstrated by ample experiments.
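The two modules described in the abstract can be illustrated with a minimal sketch. The code below is a hypothetical NumPy illustration, not the authors' implementation: local features are assumed to lie on an H × W spatial grid, each sketch feature is matched against photo features within a square "expansion window" around its own grid position, and the per-feature best-match similarities are fused with softmax importance weights into one retrieval score. The function name `ewlaw_score`, the window size, and the softmax weighting are all assumptions for illustration.

```python
import numpy as np

def cosine_sim(a, b):
    """Pairwise cosine similarity between rows of a and rows of b."""
    a = a / (np.linalg.norm(a, axis=-1, keepdims=True) + 1e-8)
    b = b / (np.linalg.norm(b, axis=-1, keepdims=True) + 1e-8)
    return a @ b.T

def ewlaw_score(sketch_feats, photo_feats, grid, window=1):
    """Hypothetical expansion-window alignment + weighted fusion.

    sketch_feats, photo_feats: (H*W, d) local features on an H x W grid.
    Each sketch cell is aligned to its best-matching photo cell within a
    (2*window+1)^2 spatial neighborhood; the aligned similarities are then
    fused with softmax importance weights into a scalar score.
    """
    H, W = grid
    sims = cosine_sim(sketch_feats, photo_feats)  # (H*W, H*W)
    aligned = np.empty(H * W)
    for idx in range(H * W):
        r, c = divmod(idx, W)
        # Expansion window: spatial neighbors of cell (r, c), clipped at borders.
        rows = range(max(0, r - window), min(H, r + window + 1))
        cols = range(max(0, c - window), min(W, c + window + 1))
        cand = [rr * W + cc for rr in rows for cc in cols]
        aligned[idx] = sims[idx, cand].max()  # best match inside the window
    # Weighted fusion: more discriminative local features contribute more.
    w = np.exp(aligned) / np.exp(aligned).sum()
    return float((w * aligned).sum())
```

Under this sketch, a photo whose local features match the sketch's (up to small spatial shifts absorbed by the window) scores close to 1, while an unrelated photo scores lower, so ranking photos by `ewlaw_score` yields the retrieval list.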
ISSN: 0031-3203
DOI: 10.1016/j.patcog.2023.109892