On the Two Sides of Redundancy in Graph Neural Networks
arXiv (2023)
Abstract
Message passing neural networks iteratively generate node embeddings by
aggregating information from neighboring nodes. With increasing depth,
information from more distant nodes is included. However, node embeddings may
be unable to represent the growing node neighborhoods accurately and the
influence of distant nodes may vanish, a problem referred to as oversquashing.
Information redundancy in message passing, i.e., the repetitive exchange and
encoding of identical information, amplifies oversquashing. We develop a novel
aggregation scheme based on neighborhood trees, which allows for controlling
redundancy by pruning redundant branches of unfolding trees underlying standard
message passing. While the regular structure of unfolding trees allows the
reuse of intermediate results in a straightforward way, the use of neighborhood
trees poses computational challenges. We propose compact representations of
neighborhood trees and merge them, exploiting computational redundancy by
identifying isomorphic subtrees. From this, node and graph embeddings are
computed via a neural architecture inspired by tree canonization techniques.
Our method is less susceptible to oversquashing than traditional message
passing neural networks and can improve the accuracy on widely used benchmark
datasets.
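The contrast between standard unfolding trees and the pruned neighborhood trees described above can be illustrated with a minimal sketch. This is not the paper's implementation; the function names and the full-pruning strategy (removing every branch that revisits a node on the root path) are illustrative assumptions.

```python
def unfolding_tree(adj, v, depth):
    # Standard message-passing view: the unfolding tree contains one branch
    # per walk of the given length, so nodes recur and information repeats.
    if depth == 0:
        return (v, [])
    return (v, [unfolding_tree(adj, u, depth - 1) for u in adj[v]])

def neighborhood_tree(adj, v, depth, path=frozenset()):
    # Pruned variant (illustrative): drop any branch whose node already
    # appears on the root-to-leaf path, removing redundant re-encodings.
    path = path | {v}
    if depth == 0:
        return (v, [])
    children = [neighborhood_tree(adj, u, depth - 1, path)
                for u in adj[v] if u not in path]
    return (v, children)

def size(tree):
    # Count nodes in a (root, children) tuple tree.
    return 1 + sum(size(c) for c in tree[1])

# On a triangle graph, the depth-2 unfolding tree from node 0 has 7 nodes,
# while the pruned neighborhood tree has only 5.
adj = {0: [1, 2], 1: [0, 2], 2: [0, 1]}
```

On cyclic graphs the gap grows with depth, which is the redundancy the aggregation scheme aims to control.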