Decoupled Differentiable Graph Neural Architecture Search

Information Sciences (2024)

Abstract
Differentiable graph neural architecture search (GNAS) automatically designs graph neural networks (GNNs) that perform well across different graph data distributions. Given a GNN search space containing multiple candidate GNN component operations, a differentiable GNAS method builds a mixed supernet in which learnable architecture parameters weight the candidate operations. Once the mixed supernet is optimized, it is pruned according to the best architecture parameters, efficiently identifying the optimal GNN architecture in the search space. However, the multiplicative coupling between the architecture parameters and the candidate operations introduces an optimization bias into the weight training of the supernet's candidate operations, and this bias degrades the performance of differentiable GNAS. To resolve this coupled optimization bias, we propose Decoupled Differentiable Graph Neural Architecture Search (D2GNAS). D2GNAS uses the Gumbel distribution as a bridge to decouple the weight optimization of the supernet's candidate operations from that of the architecture parameters, yielding a decoupled differentiable GNN architecture sampler. The sampler selects promising GNN architectures by treating the architecture parameters as sampling probabilities, and it is in turn optimized with validation gradients derived from the sampled architectures. In addition, D2GNAS builds a single-path supernet and progressively compresses it with a pruning strategy to further improve search efficiency. Extensive experiments on multiple benchmark graphs show that D2GNAS outperforms all established baselines, including both hand-designed GNNs and GNAS methods. D2GNAS also has lower time complexity than previous differentiable GNAS methods, achieving an average 5x efficiency improvement in a fair GNN search space. Code is available at https://github.com/AutoMachine0/D2GNAS.
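The decoupling mechanism the abstract describes can be illustrated with straight-through Gumbel-softmax sampling: a hard one-hot operation choice in the forward pass, with a differentiable relaxation in the backward pass so validation gradients still reach the architecture parameters. The sketch below is illustrative only and does not reproduce the authors' implementation (see the linked repository); the names `forward_sampled`, `candidate_ops`, and `arch_logits` are hypothetical.

```python
import torch
import torch.nn.functional as F

def forward_sampled(x, candidate_ops, arch_logits, tau=1.0):
    # Straight-through Gumbel-softmax: the forward pass uses a hard
    # one-hot sample, the backward pass uses the soft relaxation, so
    # gradients flow to arch_logits through the sampled architecture.
    mask = F.gumbel_softmax(arch_logits, tau=tau, hard=True)
    k = int(mask.argmax())
    # Only the sampled candidate executes, so its weights are updated
    # without the coupled interference of a weighted sum over all
    # candidate operations (the bias the abstract identifies).
    return mask[k] * candidate_ops[k](x)

# Hypothetical usage: three stand-in operations for one supernet slot.
candidate_ops = torch.nn.ModuleList(
    [torch.nn.Linear(16, 16) for _ in range(3)]
)
arch_logits = torch.nn.Parameter(torch.zeros(len(candidate_ops)))
out = forward_sampled(torch.randn(4, 16), candidate_ops, arch_logits)
```

Because only the sampled path runs, operation weights and architecture parameters are trained through separate signals, which is the decoupling the paper targets; the multiplicative mixed supernet of earlier differentiable GNAS would instead sum all candidates' outputs under one coupled objective.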
Keywords
Graph neural architecture search, Decoupled differentiable optimization, Supernet pruning, Graph neural network