DeNKD: Decoupled non-target Knowledge Distillation for Complementing Transformer-based Unsupervised Domain Adaptation

IEEE Transactions on Circuits and Systems for Video Technology (2023)

Abstract
There is a growing need to explore the potential of transformers in Unsupervised Domain Adaptation (UDA) due to their increasing success in various vision tasks. However, the application of transformers to UDA has yet to be thoroughly investigated. In this study, we design a novel pipeline specifically tailored for transformer-based UDA to address a crucial challenge: the overemphasis on transferring target-oriented information, caused mainly by the self-attention blocks in transformers and the cross-domain adversarial learning scheme. We first show that non-target information, including semantic contextual information such as background features and non-target classes, must be addressed in the domain adaptation process. Recognizing the importance of incorporating non-target knowledge, we propose a decoupled non-target knowledge distillation method called DeNKD. DeNKD decouples non-target information across domains at both the feature and logit levels. This decoupling is achieved through a bi-directional knowledge distillation approach that enables the interaction and exchange of non-target knowledge, yielding effective transformer-based cross-domain knowledge transfer. We perform extensive evaluations on several well-established UDA benchmark datasets, where DeNKD consistently achieves the best performance; for example, it reaches 85.54% accuracy on Office-Home and 89.95% on VisDA-2017. These results highlight the effectiveness of DeNKD in transformer-based UDA and its potential for improving cross-domain adaptation performance.
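The abstract does not spell out DeNKD's loss functions, but the logit-level decoupling it describes is closely related to non-target-class knowledge distillation, in which the ground-truth class is masked out and the remaining logits are renormalized before computing the distillation loss. The PyTorch sketch below illustrates that general idea under stated assumptions; the function name, the temperature value, and the use of ground-truth labels (which, for unlabeled target-domain data, would have to be replaced by pseudo-labels) are illustrative choices, not DeNKD's actual implementation.

import torch
import torch.nn.functional as F

def non_target_kd_loss(student_logits, teacher_logits, target, temperature=4.0):
    """Hypothetical sketch of logit-level non-target knowledge distillation.

    The target class is removed from each row of logits and the remaining
    (non-target) logits are renormalized, so the KL divergence transfers only
    the teacher's knowledge about non-target classes. This follows the general
    decoupled-KD idea; the exact DeNKD formulation is not given in the abstract.
    """
    num_classes = student_logits.size(1)

    # Boolean mask that drops the ground-truth (target) class in each row.
    non_target_mask = torch.ones_like(student_logits, dtype=torch.bool)
    non_target_mask.scatter_(1, target.unsqueeze(1), False)

    # Keep only the non-target logits and rescale with a temperature.
    s_nt = student_logits[non_target_mask].view(-1, num_classes - 1)
    t_nt = teacher_logits[non_target_mask].view(-1, num_classes - 1)
    log_p_student = F.log_softmax(s_nt / temperature, dim=1)
    p_teacher = F.softmax(t_nt / temperature, dim=1)

    # KL divergence over the non-target class distribution only,
    # scaled by T^2 as is standard for temperature-softened distillation.
    return F.kl_div(log_p_student, p_teacher, reduction="batchmean") * temperature ** 2

In a bi-directional scheme such as the one the abstract describes, a loss of this form would presumably be applied in both the source-to-target and target-to-source directions so that the two models exchange non-target knowledge.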
Keywords
Transformer, Unsupervised domain adaptation, Knowledge distillation