Search to Fine-tune Pre-trained Graph Neural Networks for Graph-level Tasks
arXiv (2023)
Abstract
Recently, graph neural networks (GNNs) have shown unprecedented success
in many graph-related tasks. However, GNNs face the label scarcity issue just as
other neural networks do. Thus, recent efforts try to pre-train GNNs on
large-scale unlabeled graphs and adapt the knowledge from the unlabeled graphs to
the target downstream task. The adaptation is generally achieved by fine-tuning
the pre-trained GNNs with a limited amount of labeled data. Despite the
importance of fine-tuning, current GNN pre-training works often neglect to
design a good fine-tuning strategy that better leverages the transferred knowledge
and improves performance on downstream tasks. Only a few works have started to
investigate better fine-tuning strategies for pre-trained GNNs, but their
designs either rest on strong assumptions or overlook the data-aware issue across
diverse downstream datasets. In this paper, we therefore aim to design a better
fine-tuning strategy for pre-trained GNNs to improve model performance.
Given a pre-trained GNN, we propose to search to fine-tune pre-trained graph
neural networks for graph-level tasks (S2PGNN), which adaptively designs a
suitable fine-tuning framework for the given labeled data of the downstream
task. To ensure that searching for a fine-tuning strategy brings improvement, we
carefully summarize a proper search space of fine-tuning frameworks that is
suitable for GNNs. Empirical studies show that S2PGNN can be implemented on
top of 10 well-known pre-trained GNNs and consistently improves their
performance. Moreover, S2PGNN achieves better performance than existing
fine-tuning strategies both within and outside the GNN area. Our code is publicly
available.
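
To make the core idea concrete, below is a minimal sketch (not the authors' implementation) of searching over a fine-tuning design space for a pre-trained GNN. The search-space dimensions, candidate choices, and the fine_tune_and_evaluate() helper are illustrative assumptions; S2PGNN's actual search space and search algorithm may differ.

```python
# Hedged sketch: random search over a hypothetical fine-tuning design space.
# All names and choices here are assumptions for illustration only.
import random

# Hypothetical search space: each dimension is one fine-tuning design choice.
SEARCH_SPACE = {
    "tuned_layers": ["last", "last_two", "all"],      # which GNN layers to update
    "readout": ["mean", "sum", "max", "attention"],   # graph-level pooling
    "learning_rate": [1e-2, 1e-3, 1e-4],
    "use_skip_to_readout": [True, False],             # feed intermediate layers to readout
}

def sample_strategy(space):
    """Draw one fine-tuning strategy uniformly at random from the space."""
    return {dim: random.choice(options) for dim, options in space.items()}

def fine_tune_and_evaluate(strategy):
    """Placeholder: fine-tune the pre-trained GNN under `strategy` on the
    labeled downstream data and return a validation score. A real version
    would build the model, train it, and evaluate it here."""
    return random.random()  # stand-in for a validation metric

def random_search(space, budget=20):
    """Keep the sampled strategy with the best validation score."""
    best_score, best_strategy = float("-inf"), None
    for _ in range(budget):
        strategy = sample_strategy(space)
        score = fine_tune_and_evaluate(strategy)
        if score > best_score:
            best_score, best_strategy = score, strategy
    return best_strategy, best_score

if __name__ == "__main__":
    strategy, score = random_search(SEARCH_SPACE)
    print(f"best strategy: {strategy} (val score {score:.3f})")
```

Because the search is data-driven, a different downstream dataset can yield a different best strategy, which is the data-aware behavior the abstract motivates.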
Keywords
graph neural networks, neural networks, search, fine-tune, pre-trained, graph-level