Semi-Global Context Network for Semantic Correspondence

Bibliographic Details
Published in: IEEE Access, 2021, Vol. 9, pp. 2496-2507
Main Authors: Lee, Ho-Jun, Choi, Hong Tae, Park, Sung Kyu, Park, Ho-Hyun
Format: Article
Language: English
Description
Summary: Estimating semantic correspondence between pairs of images is challenging due to intra-class variation, background clutter, and repetitive patterns. This paper proposes a convolutional neural network (CNN) that aims to learn rich semantic representations encoding the global semantic context, enabling semantic correspondence estimation that is robust to intra-class variation and repetitive patterns. We introduce a global context fused feature representation that efficiently exploits the global semantic context in estimating semantic correspondence, as well as a semi-global self-similarity feature that reduces the distraction caused by background clutter when capturing the global semantic context. The proposed network is trained in an end-to-end manner using a weakly supervised loss, which requires only a weak level of supervision in the form of image-pair annotations, and this loss is supplemented with a historical averaging loss to train the network effectively. Our approach decreases running time by a factor of more than four, reduces the training memory requirement by a factor of three, and produces competitive or superior results relative to previous approaches on the PF-PASCAL, PF-WILLOW, and TSS benchmarks.
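
The record does not reproduce the paper's loss formulation. Assuming the historical averaging term follows the common definition (a penalty on the squared distance between the current parameters and their running historical mean, as in Salimans et al., 2016), a minimal PyTorch sketch could look as follows; the class name, the coefficient `weight`, and the incremental mean update are illustrative assumptions, not the authors' code.

import torch

class HistoricalAveraging:
    # Hedged sketch: maintain a running mean of the model's parameters and
    # penalize the squared distance of the current parameters from that mean.
    def __init__(self, model: torch.nn.Module, weight: float = 1e-3):
        self.model = model
        self.weight = weight  # penalty coefficient; an assumed hyperparameter
        self.step = 0
        # One running-mean buffer per parameter tensor (detached, no gradients).
        self.avg = [p.detach().clone() for p in model.parameters()]

    def penalty(self):
        self.step += 1
        loss = 0.0
        for p, a in zip(self.model.parameters(), self.avg):
            a += (p.detach() - a) / self.step  # incremental mean: a <- a + (p - a) / t
            loss = loss + ((p - a) ** 2).sum()  # gradient flows through p only
        return self.weight * loss

In training, such a term would simply be added to the weakly supervised loss at each step, e.g. total_loss = weak_loss + ha.penalty() with ha = HistoricalAveraging(net).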
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2020.3046845