Hierarchical Knowledge Propagation and Distillation for Few-Shot Learning

Neural Networks: the official journal of the International Neural Network Society (2023)

Abstract
Recent research on Few-Shot Learning (FSL) has made extensive progress. However, existing efforts primarily focus on the transductive setting of FSL, which is heavily constrained by the limited quantity of the unlabeled query set. Although a few inductive FSL methods have been studied, most of them emphasize learning superior feature-extraction networks and may therefore ignore the relations between sample-level and class-level representations, which are particularly crucial when labeled samples are scarce. This paper proposes an inductive FSL framework that leverages Hierarchical Knowledge Propagation and Distillation, named HKPD. To learn more discriminative sample-level representations, HKPD first constructs a sample-level information propagation module that explores pairwise sample relations. A class-level information propagation module is then designed to obtain and update class-level information. Moreover, a self-distillation module further improves the learned representations by propagating the obtained knowledge across this hierarchical architecture. Extensive experiments on commonly used few-shot benchmark datasets demonstrate the superiority of the proposed HKPD method, which outperforms current state-of-the-art methods.
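The abstract only names the three modules, so the following is a minimal sketch of one plausible reading of that hierarchy in PyTorch, not the paper's published implementation. The class name HKPDSketch, the affinity-based message passing, the cosine-similarity classifier, and the distillation temperature are all illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class HKPDSketch(nn.Module):
    """Illustrative sketch of hierarchical propagation + self-distillation.

    All names and equations here are assumptions made for illustration;
    they are not taken from the paper's code.
    """

    def __init__(self, feat_dim: int, temperature: float = 4.0):
        super().__init__()
        self.proj = nn.Linear(feat_dim, feat_dim)  # hypothetical projection
        self.temperature = temperature

    def sample_level_propagation(self, feats: torch.Tensor) -> torch.Tensor:
        # Pairwise sample relations via a normalized affinity matrix:
        # one plausible instantiation of "sample-level information propagation".
        normed = F.normalize(feats, dim=-1)
        attn = F.softmax(normed @ normed.t(), dim=-1)
        return feats + attn @ self.proj(feats)  # residual message passing

    def class_level_propagation(self, feats, labels, n_way):
        # Class-level information as per-class means of propagated features.
        return torch.stack(
            [feats[labels == c].mean(dim=0) for c in range(n_way)]
        )

    def logits(self, query_feats, protos):
        # Cosine-similarity classifier against class-level representations.
        return F.normalize(query_feats, dim=-1) @ F.normalize(protos, dim=-1).t()

    def self_distillation_loss(self, student_logits, teacher_logits):
        # KL divergence between softened predictions; here the deeper
        # (propagated) branch teaches the shallower one.
        t = self.temperature
        return F.kl_div(
            F.log_softmax(student_logits / t, dim=-1),
            F.softmax(teacher_logits.detach() / t, dim=-1),
            reduction="batchmean",
        ) * (t * t)


if __name__ == "__main__":
    # Hypothetical 5-way 1-shot episode; propagation runs over the support
    # set only, so each query is classified independently (inductive setting).
    n_way, k_shot, n_query, d = 5, 1, 15, 64
    support, query = torch.randn(n_way * k_shot, d), torch.randn(n_query, d)
    labels = torch.arange(n_way).repeat_interleave(k_shot)

    model = HKPDSketch(feat_dim=d)
    support_feats = model.sample_level_propagation(support)
    protos = model.class_level_propagation(support_feats, labels, n_way)
    raw_protos = model.class_level_propagation(support, labels, n_way)

    student = model.logits(query, raw_protos)  # shallower branch
    teacher = model.logits(query, protos)      # deeper, propagated branch
    print(model.self_distillation_loss(student, teacher))
```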
Keywords
Few-Shot Learning, Knowledge Distillation, Inductive Learning, Feature Representation, Classification