PCGraph: Accelerating GNN Inference on Large Graphs via Partition Caching
2021 IEEE Intl Conf on Parallel & Distributed Processing with Applications, Big Data & Cloud Computing, Sustainable Computing & Communications, Social Computing & Networking (ISPA/BDCloud/SocialCom/SustainCom) (2021)
Abstract
Graph neural networks (GNNs) have emerged as powerful learning tools for unstructured data and have been successfully applied to many graph-based application domains. Sampling-based GNN inference is commonly adopted in existing graph learning frameworks to handle large-scale graphs. However, this approach is restricted by the problems of redundant vertex embedding computation in GPU and inefficient lo...
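The abstract points to feature caching as a way to avoid repeatedly fetching vertex data during sampling-based inference. As an illustrative sketch only (this is not the paper's partition-caching scheme; the degree-based policy, class name, and hit/miss accounting below are hypothetical), a simple static feature cache for sampled mini-batches might look like:

```python
import numpy as np

class FeatureCache:
    """Illustrative static feature cache: keeps features of the
    highest-degree vertices resident (a common heuristic, since
    high-degree vertices are sampled most often). The cached dict
    stands in for fast GPU memory; the full matrix for host memory."""

    def __init__(self, features, degrees, capacity):
        self.features = features  # full feature matrix (simulated host memory)
        hot = np.argsort(degrees)[::-1][:capacity]  # top-degree vertices
        self.cache = {int(v): features[v] for v in hot}
        self.hits = 0
        self.misses = 0

    def gather(self, node_ids):
        """Fetch features for a sampled mini-batch, counting cache hits."""
        out = np.empty((len(node_ids), self.features.shape[1]),
                       dtype=self.features.dtype)
        for i, v in enumerate(node_ids):
            if v in self.cache:
                self.hits += 1
                out[i] = self.cache[v]
            else:
                self.misses += 1
                out[i] = self.features[v]  # simulated slow host-to-GPU path
        return out
```

The hit rate of such a cache grows with how skewed the sampling distribution is toward high-degree vertices; the paper's contribution lies in a more elaborate partition-level policy than this per-vertex sketch.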
Key words
graph neural networks, inference, embedding computation, feature caching, pipeline parallel