GraphEdit: Large Language Models for Graph Structure Learning
CoRR (2024)
Abstract
Graph Structure Learning (GSL) focuses on capturing intrinsic dependencies
and interactions among nodes in graph-structured data by generating novel graph
structures. Graph Neural Networks (GNNs) have emerged as promising GSL
solutions, utilizing recursive message passing to encode node-wise
inter-dependencies. However, many existing GSL methods heavily depend on
explicit graph structural information as supervision signals, leaving them
susceptible to challenges such as data noise and sparsity. In this work, we
propose GraphEdit, an approach that leverages large language models (LLMs) to
learn complex node relationships in graph-structured data. By enhancing the
reasoning capabilities of LLMs through instruction-tuning over graph
structures, we aim to overcome the limitations associated with explicit graph
structural information and enhance the reliability of graph structure learning.
Our approach not only effectively denoises noisy connections but also
identifies node-wise dependencies from a global perspective, providing a
comprehensive understanding of the graph structure. We conduct extensive
experiments on multiple benchmark datasets to demonstrate the effectiveness and
robustness of GraphEdit across various settings. We have made our model
implementation available at: https://github.com/HKUDS/GraphEdit.
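The abstract's core idea, using an LLM to judge candidate edges and discard noisy connections, can be sketched minimally as follows. The `llm_judge` function here is a hypothetical stand-in (a trivial keyword-overlap heuristic), not the paper's instruction-tuned model; the edge-filtering loop only illustrates the denoising step in principle.

```python
def llm_judge(node_a: str, node_b: str) -> bool:
    """Hypothetical stand-in for an instruction-tuned LLM that decides
    whether two nodes should be connected. Here: a trivial word-overlap
    heuristic, NOT the paper's actual model."""
    words_a = set(node_a.lower().split())
    words_b = set(node_b.lower().split())
    return len(words_a & words_b) > 0

def denoise_edges(texts, candidate_edges):
    """Keep only the candidate edges the judge accepts (edge denoising)."""
    return [(u, v) for (u, v) in candidate_edges if llm_judge(texts[u], texts[v])]

# Illustrative toy data (not from the paper).
texts = {
    0: "graph neural network survey",
    1: "graph message passing",
    2: "cooking pasta recipes",
}
edges = [(0, 1), (0, 2), (1, 2)]
print(denoise_edges(texts, edges))  # → [(0, 1)]
```

In GraphEdit itself the judging step is performed by an LLM instruction-tuned over graph structures, which lets the model reason about node relationships from a global perspective rather than a surface heuristic.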