Domain-invariant feature learning with label information integration for cross-domain classification

Neural Computing and Applications (2024)

Abstract
Traditional methods for unsupervised cross-domain classification learn a common low-dimensional subspace using images from a well-labeled source domain and an unlabeled target domain. To achieve domain-invariant feature extraction, researchers typically focus on minimizing the distribution discrepancy. However, these methods often overlook the fact that label information contains critical categorization cues for discriminative subspace learning. This paper proposes a novel method, named domain-invariant feature learning with label information integration (DILI), which integrates metric learning and label information extraction to learn a cross-domain discriminant subspace. DILI first reduces the distances between the source and target domains to mitigate the marginal distribution discrepancy. Then, it reduces the distances between cross-domain samples from the same class to mitigate the conditional distribution discrepancy. Dual terms are imposed to balance the label information of both domains to learn common features, and a discriminant subspace is learned for cross-domain tasks. This method obtains domain-invariant features, while the label information of the source domain and pseudo labels from the target domain are used to improve the discriminability of the subspace. Experimental results on eight cross-domain datasets show that DILI outperforms several state-of-the-art methods.
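The abstract's exact objective is not given here, but the two steps it describes (aligning marginal distributions, then aligning per-class conditional distributions using source labels and target pseudo labels) follow a well-known MMD-matrix construction. The sketch below illustrates that construction under the assumption of a linear, JDA-style formulation; the function names and the final trace objective are illustrative, not the authors' implementation.

```python
import numpy as np

def mmd_matrix(ns, nt):
    """Marginal MMD coefficient matrix M0 for stacked [source; target] samples.

    tr(A^T X M0 X^T A) measures the distance between the projected
    source mean and target mean (marginal distribution discrepancy).
    """
    e = np.vstack([np.full((ns, 1), 1.0 / ns),
                   np.full((nt, 1), -1.0 / nt)])
    return e @ e.T  # shape (ns + nt, ns + nt)

def conditional_mmd_matrix(ys, yt_pseudo, classes):
    """Sum of per-class MMD matrices using source labels and target pseudo
    labels (conditional distribution discrepancy)."""
    ns, nt = len(ys), len(yt_pseudo)
    n = ns + nt
    M = np.zeros((n, n))
    for c in classes:
        src = np.where(np.asarray(ys) == c)[0]
        tgt = ns + np.where(np.asarray(yt_pseudo) == c)[0]
        if len(src) == 0 or len(tgt) == 0:
            continue  # skip classes absent from either domain
        e = np.zeros((n, 1))
        e[src] = 1.0 / len(src)
        e[tgt] = -1.0 / len(tgt)
        M += e @ e.T
    return M

# With X the stacked (d, n) data and A a (d, k) projection, the combined
# discrepancy is tr(A^T X (M0 + Mc) X^T A); the paper's method additionally
# constrains A with label-scatter terms, whose exact form is not stated here.
```

In this family of methods, the target pseudo labels `yt_pseudo` are typically obtained by classifying target samples in the current subspace and refining them over several iterations.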
Keywords
Cross-domain classification, Feature extraction, Subspace learning, Label information