Contrastive fine-tuning for low-resource graph-level transfer learning

Yutai Duan, Jie Liu, Shaowei Chen, Jianhua Wu

Information Sciences (2024)

Abstract
Due to insufficient supervision and the gap between pre-training pretext tasks and downstream tasks, transferring pre-trained graph neural networks (GNNs) to downstream tasks in low-resource scenarios remains challenging. In this paper, a Contrastive Fine-tuning (Con-tuning) framework is proposed for low-resource graph-level transfer learning, and a graph-level supervised contrastive learning (SCL) task is designed within the framework as the first attempt to introduce SCL into the fine-tuning process of pre-trained GNNs. The SCL task compensates for insufficient supervision in low-resource scenarios and narrows the gap between pretext tasks and downstream tasks. To further reinforce the supervision signal in the SCL task, we devise a graphon-theory-based labeled graph generator that extracts the generalized knowledge of a specific class of graphs. Based on this knowledge, graph-level templates are generated for each class and used as contrastive samples in the SCL task. The proposed Con-tuning framework then jointly learns the SCL task and the downstream task to effectively fine-tune the pre-trained GNNs. Extensive experiments on eight real-world datasets show that the Con-tuning framework enables pre-trained GNNs to achieve better performance on graph-level downstream tasks in low-resource settings.
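The abstract describes the method only at a high level. As a rough illustration of the core idea, the minimal sketch below (not the authors' code) shows a graph-level supervised contrastive loss in which one template embedding per class serves as the contrastive sample, combined with a downstream classification loss for joint fine-tuning. The templates here are naively averaged class embeddings, a stand-in assumption for the paper's graphon-based labeled graph generator; all function names, the temperature, the loss weight, and the use of PyTorch are assumptions for illustration only.

import torch
import torch.nn.functional as F


def make_class_templates(embeddings, labels, num_classes):
    # Crude stand-in for the graphon-based generator: average the embeddings
    # of each class to obtain one template embedding per class (assumption).
    dim = embeddings.size(1)
    templates = torch.zeros(num_classes, dim, device=embeddings.device)
    for c in range(num_classes):
        mask = labels == c
        if mask.any():
            templates[c] = embeddings[mask].mean(dim=0)
    return templates


def scl_loss(embeddings, labels, templates, temperature=0.1):
    # Supervised contrastive loss: pull each graph embedding toward its own
    # class template and push it away from the templates of other classes.
    z = F.normalize(embeddings, dim=1)   # (N, d) graph-level embeddings
    t = F.normalize(templates, dim=1)    # (C, d) one template per class
    logits = z @ t.t() / temperature     # (N, C) similarities to templates
    return F.cross_entropy(logits, labels)


if __name__ == "__main__":
    torch.manual_seed(0)
    # Pretend these embeddings came from a pre-trained GNN encoder on a
    # small labeled batch (8 graphs, 16-dim embeddings, 3 classes).
    emb = torch.randn(8, 16)
    y = torch.randint(0, 3, (8,))
    templates = make_class_templates(emb, y, num_classes=3)

    # Joint objective: downstream classification loss + weighted SCL term
    # (the 0.5 weight and the linear head are assumed, not from the paper).
    head = torch.nn.Linear(16, 3)
    total_loss = F.cross_entropy(head(emb), y) + 0.5 * scl_loss(emb, y, templates)
    print(float(total_loss))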
Keywords
Graph neural networks, Graph-level tasks, Low-resource scenarios, Transfer learning, Contrastive learning