Towards Self-supervised Learning on Graphs with Heterophily

Conference on Information and Knowledge Management (2022)

Abstract
Recently emerged heterophilous graph neural networks have significantly reduced the reliance on the homophily assumption, under which linked nodes have similar features and labels. However, these methods focus on a supervised setting that relies heavily on label information and shows limitations on general downstream graph tasks. In this work, we propose a self-supervised representation learning paradigm on graphs with heterophily (namely HGRL) for improving the generalizability of node representations, where node representations are optimized without any label guidance. Inspired by the designs of existing heterophilous graph neural networks, HGRL learns node representations by preserving the original node features and capturing informative distant neighbors. These two properties are obtained through carefully designed pretext tasks that are optimized based on estimated high-order mutual information. Theoretical analysis interprets the connections between HGRL and existing advanced graph neural network designs. Extensive experiments on different downstream tasks demonstrate the effectiveness of the proposed framework.
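The two pretext tasks described in the abstract can be illustrated with a short, purely hypothetical sketch (not the authors' HGRL code): one term reconstructs the original node features from the learned representations, and a second term pulls each node toward its most feature-similar non-adjacent nodes as a simple stand-in for "informative distant neighbors". The paper's estimated high-order mutual information objective is replaced here by plain reconstruction error and cosine agreement; all function and variable names are illustrative assumptions.

```python
# Hypothetical illustration only -- NOT the authors' HGRL implementation.
# Pretext task 1: preserve the original node features (feature reconstruction).
# Pretext task 2: capture informative distant neighbors (approximated here by
# the top-k most feature-similar nodes outside the 1-hop neighborhood; the
# paper instead optimizes an estimated high-order mutual information).
import torch
import torch.nn.functional as F

def hgrl_style_loss(x, z, adj, decoder, k=5, alpha=0.5):
    """
    x       : [N, F] raw node features
    z       : [N, D] encoder output
    adj     : [N, N] dense 0/1 adjacency matrix without self-loops
    decoder : nn.Module mapping D -> F, trained jointly with the encoder
    """
    n = x.size(0)

    # (1) Feature-preservation term: reconstruct x from z.
    loss_feat = F.mse_loss(decoder(z), x)

    # (2) Distant-neighbor term: rank non-adjacent, non-self nodes by
    #     raw-feature cosine similarity and treat the top-k as positives.
    x_norm = F.normalize(x, dim=1)
    sim = x_norm @ x_norm.t()                                  # [N, N] similarity
    mask = adj.bool() | torch.eye(n, dtype=torch.bool, device=adj.device)
    sim = sim.masked_fill(mask, float("-inf"))                 # drop self + 1-hop
    topk = sim.topk(k, dim=1).indices                          # [N, k] distant peers

    z_norm = F.normalize(z, dim=1)
    positives = z_norm[topk]                                   # [N, k, D]
    agreement = (z_norm.unsqueeze(1) * positives).sum(-1)      # cosine scores
    loss_dist = -agreement.mean()                              # maximize agreement

    return alpha * loss_feat + (1 - alpha) * loss_dist
```

In a full pipeline, z would come from any encoder (a GNN or an MLP), and this combined loss would be minimized with a standard optimizer over the encoder and decoder parameters together, without using any labels.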
Keywords
graph neural networks, self-supervised learning, representation learning, heterophilous graph