FedGT: Federated Node Classification with Scalable Graph Transformer
CoRR (2024)
Abstract
Graphs are widely used to model relational data. As graphs grow larger in
real-world scenarios, there is an increasing trend to store and process
subgraphs in multiple local systems. For example, recently proposed
subgraphs in multiple local systems. For example, recently proposed
subgraph federated learning methods train Graph Neural Networks (GNNs)
distributively on local subgraphs and aggregate GNN parameters with a central
server. However, existing methods have the following limitations: (1) The links
between local subgraphs are missing in subgraph federated learning. This could
severely damage the performance of GNNs that follow message-passing paradigms
to update node/edge features. (2) Most existing methods overlook the subgraph
heterogeneity issue, brought by subgraphs being from different parts of the
whole graph. To address these challenges, we propose a scalable Federated
Graph Transformer (FedGT) in this paper. First, we design a hybrid attention
scheme that reduces the complexity of
the Graph Transformer to linear while ensuring a global receptive field with
theoretical bounds. Specifically, each node attends to the sampled local
neighbors and a set of curated global nodes to learn both local and global
information and be robust to missing links. The global nodes are dynamically
updated during training with an online clustering algorithm to capture the data
distribution of the corresponding local subgraph. Second, FedGT computes
clients' similarity based on the aligned global nodes with optimal transport.
The similarity is then used to perform weighted averaging for personalized
aggregation, which effectively mitigates the data heterogeneity problem. Moreover,
local differential privacy is applied to further protect the privacy of
clients. Finally, extensive experimental results on 6 datasets and 2 subgraph
settings demonstrate the superiority of FedGT.
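The hybrid attention scheme described above can be illustrated with a minimal NumPy sketch. This is a hypothetical simplification, not the paper's implementation: each node attends only to `s` sampled neighbors plus `k` shared global nodes, so the total cost is O(n·(s+k)) rather than the O(n²) of full self-attention. The function names and the single-head, no-projection form are assumptions for illustration.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over the last axis."""
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def hybrid_attention(h, neighbor_idx, global_nodes):
    """Each node attends over its sampled local neighbors plus a small
    set of global nodes, giving a global receptive field at linear cost.

    h:            (n, d) node features
    neighbor_idx: list of index lists, sampled neighbors per node
    global_nodes: (k, d) curated global node embeddings
    """
    n, d = h.shape
    out = np.zeros_like(h)
    for i in range(n):
        # keys = sampled local neighbors + shared global nodes: (s + k, d)
        keys = np.vstack([h[neighbor_idx[i]], global_nodes])
        scores = keys @ h[i] / np.sqrt(d)   # scaled dot-product scores
        out[i] = softmax(scores) @ keys     # convex combination of keys
    return out
```

Because the global nodes summarize the whole local subgraph, a node still receives graph-level signal even when its cross-client links are missing.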
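The dynamic update of global nodes via online clustering can be sketched as a mini-batch k-means style step; this is an assumed stand-in for whatever clustering rule the paper actually uses. Each batch of node embeddings pulls its nearest global node (centroid) toward it with a per-centroid decaying learning rate.

```python
import numpy as np

def update_global_nodes(global_nodes, batch_embeddings, counts):
    """One online-clustering step: assign each embedding to its nearest
    global node and move that centroid toward the embedding.

    global_nodes: (k, d) current centroids, updated in place
    counts:       (k,) running assignment counts per centroid
    """
    for x in batch_embeddings:
        dists = np.linalg.norm(global_nodes - x, axis=1)
        j = int(dists.argmin())
        counts[j] += 1
        eta = 1.0 / counts[j]  # decaying per-centroid learning rate
        global_nodes[j] = (1 - eta) * global_nodes[j] + eta * x
    return global_nodes, counts
```

Run after each training step, the centroids drift toward the embedding distribution of the local subgraph, which is what lets them act as a compact summary of the client's data.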
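The personalized aggregation step can be sketched as follows. As a simplification, the optimal-transport alignment between two clients' global-node sets is replaced here by a min-cost one-to-one matching (the uniform-weight special case of optimal transport), brute-forced over permutations since the number of global nodes per client is small; the similarity function and its exponential form are assumptions, not the paper's exact formulation.

```python
import itertools
import numpy as np

def client_similarity(g_i, g_j):
    """Align two clients' (k, d) global-node sets with a min-cost
    matching, then map the aligned cost to a similarity in (0, 1]."""
    # pairwise Euclidean costs between the two sets: (k, k)
    cost = np.linalg.norm(g_i[:, None, :] - g_j[None, :, :], axis=-1)
    k = cost.shape[0]
    # exact matching via brute force; fine for small k
    best = min(sum(cost[r, p[r]] for r in range(k))
               for p in itertools.permutations(range(k)))
    return float(np.exp(-best / k))

def personalized_average(params, sims_row):
    """Weighted average of all clients' parameter arrays, with weights
    given by one client's (normalized) similarity row."""
    w = np.asarray(sims_row, dtype=float)
    w = w / w.sum()
    return sum(wk * pk for wk, pk in zip(w, params))
```

Note that the matching makes the similarity invariant to the arbitrary ordering of global nodes, so two clients with the same embedding distribution score identically however their centroids are indexed.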