Efficient dynamic domain adaptation on deep CNN
Published in: | Multimedia Tools and Applications, 2020-12, Vol. 79 (45-46), pp. 33853-33873 |
Main Authors: | , , , |
Format: | Article |
Language: | English |
Summary: | Domain adaptation is widely used in deep neural networks to address the problem of data distribution shift. Most deep CNN models use the Maximum Mean Discrepancy (MMD) to measure the distribution difference between the source and target domains, and have achieved great success on transfer learning tasks. However, these conventional domain adaptation methods are of limited use for dynamic knowledge adaptation because they apply a constant transfer coefficient to all adaptation layers. In this paper, we propose an efficient dynamic domain adaptation method for deep CNN models that transfers knowledge dynamically according to the layer and the extent of training. Specifically, we first present a detailed analysis of how transferable each layer is in various deep CNN models, including different VGG and AlexNet architectures. Based on this analysis, we then propose a dynamic transfer coefficient. In doing so, the paper gives guidance on how to choose transfer layers and adaptation coefficients aptly, instead of relying on the empirically chosen constant transfer parameters of conventional methods. Extensive experiments on standard benchmark datasets demonstrate that the proposed method achieves state-of-the-art results with dynamic domain adaptation parameters compared with conventional methods. |
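The summary above centers on MMD as the measure of distribution difference between source and target features. As an illustrative sketch only (not the authors' implementation; the RBF kernel choice, the bandwidth `gamma`, and the biased V-statistic estimator are all assumptions), a minimal NumPy version of an empirical squared-MMD estimate looks like this:

```python
import numpy as np

def rbf_kernel(x, y, gamma=1.0):
    """RBF kernel matrix between the rows of x (n, d) and the rows of y (m, d)."""
    sq_dists = np.sum(x**2, 1)[:, None] + np.sum(y**2, 1)[None, :] - 2 * x @ y.T
    return np.exp(-gamma * sq_dists)

def mmd2(source, target, gamma=1.0):
    """Biased empirical estimate of squared MMD between two samples.

    This is the RKHS distance ||mu_source - mu_target||^2, so it is >= 0.
    """
    k_ss = rbf_kernel(source, source, gamma)
    k_tt = rbf_kernel(target, target, gamma)
    k_st = rbf_kernel(source, target, gamma)
    return k_ss.mean() + k_tt.mean() - 2 * k_st.mean()

rng = np.random.default_rng(0)
# Two samples from the same distribution vs. a mean-shifted target sample.
same = mmd2(rng.normal(0, 1, (200, 8)), rng.normal(0, 1, (200, 8)))
shifted = mmd2(rng.normal(0, 1, (200, 8)), rng.normal(2, 1, (200, 8)))
print(shifted > same)  # True: a distribution shift yields a larger MMD
```

In MMD-based adaptation methods of the kind the abstract describes, an estimate like this is added to the task loss for each adaptation layer, weighted by a transfer coefficient; the paper's contribution is making that coefficient vary with the layer and the training progress rather than keeping it constant.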
ISSN: | 1380-7501 ; 1573-7721 |
DOI: | 10.1007/s11042-019-08584-z |