Deep Domain Adaptation under Label Scarcity

arXiv: Learning (2021)

Abstract
The goal of Domain Adaptation (DA) is to leverage labeled examples from a source domain to infer an accurate model for a target domain where labels are unavailable or scarce at best. Recently, there has been a surge of adversarial-learning-based deep-net approaches to the DA problem, a prominent example being the DANN approach [9]. These methods require a large number of labeled source examples to infer a good model for the target domain, and their performance degrades sharply as labels become scarce. In this paper, we study the behavior of such approaches (especially DANN) under scarce-label scenarios. Further, we propose an architecture, TRAVERS, that amalgamates TRAnsductive learning principles with adVERSarial learning to cushion the performance of these approaches under label scarcity. Experimental results on both text and images show a significant performance boost for TRAVERS over approaches such as DANN under scarce-label scenarios.
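The abstract refers to DANN-style adversarial learning, which encourages domain-invariant features by training a domain discriminator through a gradient-reversal layer. As an illustration only, the following is a minimal PyTorch sketch of one such adversarial training step under assumed shapes, module names, and a lambda value; it is not the TRAVERS architecture, whose transductive components are not described in the abstract.

import torch
import torch.nn as nn


class GradReverse(torch.autograd.Function):
    # Identity in the forward pass; reverses (and scales) gradients in the backward pass.
    @staticmethod
    def forward(ctx, x, lambd):
        ctx.lambd = lambd
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        return -ctx.lambd * grad_output, None


def grad_reverse(x, lambd=1.0):
    return GradReverse.apply(x, lambd)


# Hypothetical components and sizes, for illustration only.
feature_extractor = nn.Sequential(nn.Linear(100, 64), nn.ReLU())
label_classifier = nn.Linear(64, 10)       # trained on labeled source examples only
domain_discriminator = nn.Linear(64, 2)    # trained to tell source from target features

params = (list(feature_extractor.parameters())
          + list(label_classifier.parameters())
          + list(domain_discriminator.parameters()))
optimizer = torch.optim.SGD(params, lr=1e-2)
criterion = nn.CrossEntropyLoss()

# Dummy batch: labeled source data and unlabeled target data.
xs, ys = torch.randn(32, 100), torch.randint(0, 10, (32,))
xt = torch.randn(32, 100)

fs, ft = feature_extractor(xs), feature_extractor(xt)

# Task loss on the labeled source domain.
cls_loss = criterion(label_classifier(fs), ys)

# Domain loss on both domains; gradient reversal pushes the feature
# extractor toward domain-invariant representations.
f_all = torch.cat([fs, ft], dim=0)
d_labels = torch.cat([torch.zeros(32, dtype=torch.long),
                      torch.ones(32, dtype=torch.long)])
dom_loss = criterion(domain_discriminator(grad_reverse(f_all, lambd=0.1)), d_labels)

optimizer.zero_grad()
(cls_loss + dom_loss).backward()
optimizer.step()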
Keywords
domain adaptation, adversarial learning, cross domain representation