
Instance Correlation Graph for Unsupervised Domain Adaptation

Bibliographic Details
Published in: ACM Transactions on Multimedia Computing, Communications, and Applications, 2022-02, Vol. 18 (1s), p. 1-23
Main Authors: Wu, Lei, Ling, Hefei, Shi, Yuxuan, Zhang, Baiyan
Format: Article
Language:English
Description
Summary: In recent years, deep neural networks have emerged as a dominant machine learning tool across a wide variety of application fields. Because manual labeling is expensive, it is important to transfer knowledge from a label-rich source domain to an unlabeled target domain. The core problem is how to learn a domain-invariant representation that addresses the domain shift challenge, in which training and test samples come from different distributions. First, considering the geometry of the space of probability distributions, we introduce the Hellinger distance to match the source and target distributions on a statistical manifold. Second, data samples are not isolated individuals; they are interrelated, and their correlation information should not be neglected in domain adaptation. Distinguished from previous works, we focus on the correlation distributions over data samples. We carefully design a Residual Graph Convolutional Network to construct the Instance Correlation Graph (ICG), and exploit the correlation information of data samples to reduce the domain shift. We therefore propose a novel Instance Correlation Graph for Unsupervised Domain Adaptation, trained end-to-end by jointly optimizing three types of losses: a Supervised Classification loss on the source domain, a Centroid Alignment loss that measures the centroid difference between the source and target domains, and an ICG Alignment loss that matches the Instance Correlation Graphs of the two related domains. Extensive experiments on several hard transfer tasks for learning domain-invariant representations are conducted on three benchmarks: Office-31, Office-Home, and VisDA2017. Compared with other state-of-the-art techniques, our method achieves superior performance.
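The summary's first contribution rests on the Hellinger distance between probability distributions. As a minimal illustration only (not the paper's implementation, which operates on deep features over a statistical manifold), the sketch below computes the discrete Hellinger distance H(p, q) = (1/√2)·‖√p − √q‖₂, which is 0 for identical distributions and 1 for distributions with disjoint support; the function name and the toy distributions are illustrative choices, not taken from the article.

```python
import math

def hellinger(p, q):
    """Hellinger distance between two discrete probability distributions.

    H(p, q) = (1 / sqrt(2)) * sqrt(sum_i (sqrt(p_i) - sqrt(q_i)) ** 2)
    Ranges from 0 (identical distributions) to 1 (disjoint supports).
    """
    # Sanity check: both inputs should sum to 1 (valid distributions).
    assert abs(sum(p) - 1.0) < 1e-9 and abs(sum(q) - 1.0) < 1e-9
    s = sum((math.sqrt(pi) - math.sqrt(qi)) ** 2 for pi, qi in zip(p, q))
    return math.sqrt(s) / math.sqrt(2)

# Identical distributions: distance is 0.
print(hellinger([0.5, 0.5], [0.5, 0.5]))  # 0.0
# Disjoint supports: distance attains its maximum, 1.
print(hellinger([1.0, 0.0], [0.0, 1.0]))  # 1.0
```

Unlike the KL divergence, the Hellinger distance is symmetric, bounded, and a true metric, which makes it a natural candidate for measuring the discrepancy between source and target distributions.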
ISSN: 1551-6857, 1551-6865
DOI: 10.1145/3486251