Learning Calibrated Class Centers for Few-shot Classification by Pair-wise Similarity

Bibliographic Details
Published in: IEEE Transactions on Image Processing, 2022, Vol. 31, p. 1-1
Main Authors: Guo, Yurong, Du, Ruoyi, Li, Xiaoxu, Xie, Jiyang, Ma, Zhanyu, Dong, Yuan
Format: Article
Language: English
Description
Metric-based methods achieve promising performance on few-shot classification by learning clusters on support samples and generating shared decision boundaries for query samples. However, existing methods ignore the inaccurate class center approximation introduced by the limited number of support samples, which consequently leads to biased inference. Therefore, in this paper, we propose to reduce the approximation error by class center calibration. Specifically, we introduce the so-called Pair-wise Similarity Module (PSM), which generates calibrated class centers adapted to each query sample by capturing the semantic correlations between the support and the query samples, as well as enhancing the discriminative regions of the support representations. It is worth noting that the proposed PSM is a simple plug-and-play module that can be inserted into most metric-based few-shot learning models. Through extensive experiments with metric-based models, we demonstrate that the module significantly improves the performance of conventional few-shot classification methods on four few-shot image classification benchmark datasets. Code is available at: https://github.com/PRIS-CV/Pair-wise-Similarity-module.
ISSN: 1057-7149, 1941-0042
DOI: 10.1109/TIP.2022.3184813