Transfer Feature Generating Networks With Semantic Classes Structure for Zero-Shot Learning

IEEE Access (2019)

Abstract
Feature generating networks face an important issue: the distribution of the generated features does not fit that of the real data. This inconsistency further degrades model performance because, in zero-shot learning (ZSL), the training samples come from seen classes that are disjoint from the unseen classes supplying the test samples. In generalized zero-shot learning (GZSL), test samples are drawn from both seen and unseen classes, which is closer to the practical situation. Consequently, most feature generating networks that adversarially learn the distribution of semantic classes struggle to achieve satisfactory performance on the more challenging GZSL task. To alleviate the negative influence of this inconsistency on ZSL and GZSL, transfer feature generating networks with semantic classes structure (TFGNSCS) are proposed to build a network model that improves ZSL and GZSL performance. TFGNSCS not only considers the semantic structure relationship between seen and unseen classes, but also learns the difference in generated features by transferring classification-model information from seen to unseen classes within the network. The proposed method integrates a transfer loss, a classification loss, and a Wasserstein distance loss to generate sufficient CNN features, on which softmax classifiers are trained for ZSL and GZSL. Experiments demonstrate that TFGNSCS outperforms state-of-the-art models on four challenging GZSL datasets: CUB, FLO, SUN, and AwA.
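As a rough illustration of the objective described in the abstract, the following is a minimal PyTorch-style sketch that combines a Wasserstein critic term, a classification term, and a transfer term into a single generator loss. The module names, the loss weights lambda_cls and lambda_trans, and the KL-based form of the transfer term are illustrative assumptions, not the paper's exact formulation.

```python
# Hypothetical sketch of a combined generator objective (Wasserstein +
# classification + transfer losses); weights and the transfer term's exact
# form are assumptions, not the authors' implementation.
import torch.nn.functional as F

def generator_loss(critic, classifier, fake_features, labels,
                   seen_logits, unseen_logits,
                   lambda_cls=0.01, lambda_trans=0.01):
    # Wasserstein term: the generator maximizes the critic's score on
    # generated features, i.e. minimizes its negative mean.
    wass = -critic(fake_features).mean()

    # Classification term: generated features should be assigned to their
    # corresponding classes by a pretrained softmax classifier.
    cls = F.cross_entropy(classifier(fake_features), labels)

    # Transfer term (assumption): align classifier responses transferred
    # from seen to unseen classes, sketched here as a KL divergence between
    # the two softened distributions.
    trans = F.kl_div(F.log_softmax(unseen_logits, dim=1),
                     F.softmax(seen_logits, dim=1),
                     reduction="batchmean")

    return wass + lambda_cls * cls + lambda_trans * trans
```

In a full model, this generator objective would be alternated with critic updates, as in a standard WGAN training loop, before training the final softmax classifiers on real and generated features.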
Keywords
Feature generating networks, semantic classes structure, transfer loss, zero-shot learning, generalized zero-shot learning