SwinGNN: Rethinking Permutation Invariance in Diffusion Models for Graph Generation
arXiv (2023)
Abstract
Diffusion models based on permutation-equivariant networks can learn
permutation-invariant distributions for graph data. However, compared with
their non-invariant counterparts, we find that these invariant models face
greater learning challenges because 1) their effective target
distributions exhibit more modes; 2) their optimal one-step denoising scores
are the score functions of Gaussian mixtures with more components. Motivated by
this analysis, we propose a non-invariant diffusion model, called
SwinGNN, which employs an efficient edge-to-edge 2-WL message
passing network and uses shifted-window self-attention inspired by
SwinTransformer. Further, through systematic ablations, we identify several
critical training and sampling techniques that significantly improve the sample
quality of graph generation. Finally, we introduce a simple post-processing
trick, i.e., randomly permuting the generated graphs, which provably
converts any graph generative model into a permutation-invariant one. Extensive
experiments on synthetic and real-world protein and molecule datasets show that
our SwinGNN achieves state-of-the-art performance. Our code is released at
https://github.com/qiyan98/SwinGNN.
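The post-processing trick mentioned in the abstract amounts to relabeling the nodes of each generated graph with a uniformly random permutation, which symmetrizes the output distribution over all node orderings. A minimal sketch in NumPy (the helper name `random_permute_graph` is illustrative, not from the paper's code):

```python
import numpy as np

def random_permute_graph(adj, rng=None):
    """Relabel nodes of an adjacency matrix with a uniformly random permutation.

    Sampling a graph from any generator and then applying this step yields a
    permutation-invariant output distribution, since every node ordering of a
    generated graph becomes equally likely.
    """
    rng = np.random.default_rng() if rng is None else rng
    n = adj.shape[0]
    perm = rng.permutation(n)
    # Permute rows and columns consistently: adj[perm][:, perm].
    return adj[np.ix_(perm, perm)]

# Example: a 3-node path graph 0-1-2.
A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]])
A_perm = random_permute_graph(A, np.random.default_rng(0))
# The permuted graph is isomorphic to the original: same edge count
# and same degree sequence, only the node labels change.
```

Because the permutation is independent of the generated sample, this step changes only the labeling, not the distribution over isomorphism classes.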