Improving the Generalization Ability of Deep Neural Networks for Cross-Domain Visual Recognition
Published in: IEEE Transactions on Cognitive and Developmental Systems, 2021-09, Vol. 13 (3), pp. 607-620
Main Authors:
Format: Article
Language: English
Subjects:
Summary: Feature learning with deep neural networks (DNNs) has made remarkable progress in recent years. However, its data-driven nature makes collecting labeled training data expensive or impossible when the testing domain changes. Here, we propose a method of transferable feature learning and instance-level adaptation that improves the generalization ability of DNNs and thereby mitigates the domain-shift challenge in cross-domain visual recognition. When little labeled information is available, the proposed method achieves strong results in the new target domain and outperforms the typical fine-tuning approach. Two DNNs are chosen as representatives to work with the proposed method in a comprehensive study of generalization ability on image-to-image transfer, image-to-video transfer, multidomain image classification, and weakly supervised detection. The experimental results show that the proposed method is superior to existing work in the literature. In addition, a large-scale cross-domain database is merged from three different domains, providing a quantitative platform for evaluating approaches to cross-domain object detection.
ISSN: 2379-8920 (print), 2379-8939 (electronic)
DOI: 10.1109/TCDS.2020.2965166
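
As context for the comparison made in the summary, the sketch below illustrates the typical fine-tuning baseline that the paper reports outperforming: a CNN pretrained on a source domain is adapted to a small labeled target domain by retraining only its classification head. This is a minimal, generic PyTorch illustration, not the authors' proposed transferable-feature-learning and instance-level-adaptation method; the backbone choice, class count, and `finetune_step` helper are placeholder assumptions.

```python
# Minimal sketch of the fine-tuning baseline: freeze pretrained
# source-domain features, retrain only a new classification head on
# labeled target-domain data. Generic illustration, not the paper's method.
import torch
import torch.nn as nn
from torchvision import models

NUM_TARGET_CLASSES = 10  # assumption: class count of the new target domain

# An ImageNet-pretrained backbone stands in for the source-domain features.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)

# Freeze the transferable feature extractor; only the new head will train.
for param in model.parameters():
    param.requires_grad = False
model.fc = nn.Linear(model.fc.in_features, NUM_TARGET_CLASSES)

optimizer = torch.optim.SGD(model.fc.parameters(), lr=1e-3, momentum=0.9)
criterion = nn.CrossEntropyLoss()

def finetune_step(images: torch.Tensor, labels: torch.Tensor) -> float:
    """One optimization step on a batch of labeled target-domain images."""
    optimizer.zero_grad()
    loss = criterion(model(images), labels)
    loss.backward()
    optimizer.step()
    return loss.item()
```

With little labeled target data, this baseline updates only the head's parameters; the abstract's claim is that its transferable-feature and instance-level adaptation scheme generalizes better than such head-only retraining.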