GraphERT: Transformers-based Temporal Dynamic Graph Embedding

Proceedings of the 32nd ACM International Conference on Information and Knowledge Management, CIKM 2023 (2023)

Abstract
Dynamic temporal graphs evolve over time, adding and removing nodes and edges between time snapshots. The tasks performed on such graphs are diverse and include detecting temporal trends, finding graph-to-graph similarities, and graph visualization and clustering. For all these tasks, it is necessary to embed the entire graph in a low-dimensional space by using graph-level representations instead of the more common node-level representations. This embedding requires handling the appearance of new nodes over time as well as capturing temporal patterns of the entire graph. Most existing methods perform temporal node embeddings and focus on different methods of aggregating them for a graph-based representation. In this work, we propose an end-to-end architecture that captures both the node embeddings and their influence in a structural context during a specific time period of the graph. We present GraphERT (Graph Embedding Representation using Transformers), a novel approach to temporal graph-level embeddings. Our method pioneers the use of Transformers to seamlessly integrate graph structure learning with temporal analysis. By employing a masked language model on sequences of graph random walks, together with a novel temporal classification task, our model not only comprehends the intricate graph dynamics but also unravels the temporal significance of each node and path. This novel training paradigm empowers GraphERT to capture the essence of both the structural and temporal aspects of graphs, surpassing state-of-the-art approaches across multiple tasks on real-world datasets.
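The training paradigm described above, masking nodes in random-walk sequences while attaching a snapshot label to each walk, can be sketched as input preparation in plain Python. This is a hypothetical illustration, not the authors' code: the function names (`random_walk`, `make_training_examples`), the masking probability, and the toy snapshots are all assumptions for the sketch; a real pipeline would feed these examples to a Transformer with an MLM head plus a temporal-classification head.

```python
import random

def random_walk(adj, start, length, rng):
    """One random walk of up to `length` nodes over adjacency dict `adj`."""
    walk = [start]
    for _ in range(length - 1):
        nbrs = adj.get(walk[-1], [])
        if not nbrs:
            break
        walk.append(rng.choice(nbrs))
    return walk

def make_training_examples(snapshots, walk_len=5, walks_per_node=2,
                           mask_prob=0.15, seed=0):
    """Return (masked_walk, mlm_targets, time_label) triples.

    `snapshots` is a list of adjacency dicts, one per time step.
    Each walk is a token sequence; masked positions must be recovered
    by the MLM objective, and `time_label` is the snapshot index used
    for the temporal classification task.
    """
    rng = random.Random(seed)
    examples = []
    for t, adj in enumerate(snapshots):
        for node in adj:
            for _ in range(walks_per_node):
                walk = random_walk(adj, node, walk_len, rng)
                masked, targets = [], []
                for tok in walk:
                    if rng.random() < mask_prob:
                        masked.append("[MASK]")
                        targets.append(tok)   # node id the MLM must predict
                    else:
                        masked.append(tok)
                        targets.append(None)  # not masked, no MLM loss here
                examples.append((masked, targets, t))
    return examples

# Two toy snapshots: node 3 and its edges appear at time 1.
g0 = {1: [2], 2: [1]}
g1 = {1: [2], 2: [1, 3], 3: [2]}
examples = make_training_examples([g0, g1])
```

Each triple pairs one corrupted walk with its reconstruction targets and its time label, mirroring how the abstract's two objectives (masked language modeling and temporal classification) share the same input sequences.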
Keywords
graph neural networks,temporal graph embedding,social networks,natural language processing,time-series,anomaly detection