
A Lightweight Method for Graph Neural Networks Based on Knowledge Distillation and Graph Contrastive Learning

Yong Wang, Shuqun Yang

Applied Sciences (2024)

Abstract
Graph neural networks (GNNs) are crucial tools for processing non-Euclidean data. However, due to scalability issues caused by the dependency and topology of graph data, deploying GNNs in practical applications is challenging. Some methods address this issue by transferring GNN knowledge to MLPs through knowledge distillation. However, distilled MLPs cannot directly capture graph structure information and rely only on node features, resulting in poor performance and sensitivity to noise. To solve this problem, we propose a lightweight optimization method for GNNs that combines graph contrastive learning and variable-temperature knowledge distillation. First, we use graph contrastive learning to capture graph structural representations, enriching the input information for the MLP. Then, we transfer GNN knowledge to the MLP using variable-temperature knowledge distillation. Additionally, we enhance both node content and structural features before inputting them into the MLP, improving its performance and stability. Extensive experiments on seven datasets show that the proposed KDGCL model outperforms baseline models in both transductive and inductive settings; in particular, it achieves average improvements of 1.63% in transductive settings and 0.8% in inductive settings over the baselines. Furthermore, KDGCL maintains parameter efficiency and inference speed while remaining competitive in performance.
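The sketch below illustrates the general idea of distilling a GNN teacher into an MLP student with a temperature that changes over training, as described in the abstract. The linear temperature schedule, the loss weighting `alpha`, and the `StudentMLP` architecture are illustrative assumptions, not the paper's exact formulation; the structural embeddings from graph contrastive learning are assumed to be precomputed and concatenated with node features.

```python
# Minimal sketch: distill a frozen GNN teacher into an MLP student using a
# soft-label KL term whose temperature is annealed over training.
# The schedule and hyperparameters are assumptions for illustration only.
import torch
import torch.nn as nn
import torch.nn.functional as F

class StudentMLP(nn.Module):
    def __init__(self, in_dim, hidden_dim, num_classes):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, hidden_dim),
            nn.ReLU(),
            nn.Dropout(0.5),
            nn.Linear(hidden_dim, num_classes),
        )

    def forward(self, x):
        return self.net(x)

def temperature_at(epoch, total_epochs, t_max=4.0, t_min=1.0):
    """Linearly anneal the distillation temperature (assumed schedule)."""
    frac = epoch / max(total_epochs - 1, 1)
    return t_max - (t_max - t_min) * frac

def kd_loss(student_logits, teacher_logits, labels, temperature, alpha=0.5):
    """Soft-label KL term (scaled by T^2) plus hard-label cross-entropy."""
    soft = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * temperature ** 2
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard

if __name__ == "__main__":
    num_nodes, feat_dim, num_classes, epochs = 256, 64, 7, 100
    # Node content features concatenated with (assumed precomputed) structural
    # embeddings from graph contrastive learning; random tensors stand in here.
    features = torch.randn(num_nodes, feat_dim)
    teacher_logits = torch.randn(num_nodes, num_classes)   # frozen GNN outputs
    labels = torch.randint(0, num_classes, (num_nodes,))

    student = StudentMLP(feat_dim, 128, num_classes)
    optim = torch.optim.Adam(student.parameters(), lr=1e-3)

    for epoch in range(epochs):
        t = temperature_at(epoch, epochs)
        logits = student(features)
        loss = kd_loss(logits, teacher_logits, labels, t)
        optim.zero_grad()
        loss.backward()
        optim.step()
```

At inference time only the MLP student is used, which is what removes the graph-dependency and scalability bottlenecks of message passing.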
Key words
graph neural network, lightweight technology, knowledge distillation, graph contrastive learning