Memory-efficient Transformer-based network model for Traveling Salesman Problem.

Neural Networks: the official journal of the International Neural Network Society (2023)

Abstract
Combinatorial optimization problems such as the Traveling Salesman Problem (TSP) have a wide range of real-world applications in transportation, logistics, and manufacturing. Solving large-scale TSP instances quickly has long been difficult because of memory usage limitations. Recent research shows that the Transformer model is a promising approach. However, the Transformer has several severe problems that prevent it from quickly solving TSP combinatorial optimization problems, such as quadratic time complexity, especially quadratic space complexity, and the inherent limitations of the encoder and decoder themselves. To address these issues, we developed a memory-efficient Transformer-based network model for TSP combinatorial optimization problems, termed Tspformer, with two distinctive characteristics: (1) a sampled scaled dot-product attention mechanism with O(L log(L)) time and space complexity, where L is the length of the input sequence; this is the main difference between our work and prior work. (2) Due to the reduced space complexity, GPU/CPU memory usage is significantly reduced. Extensive experiments demonstrate that Tspformer significantly outperforms existing methods and provides a new solution to TSP combinatorial optimization problems. Our PyTorch code will be publicly available on GitHub at https://github.com/yhnju/tspFormer.
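The core idea named in the abstract is to compute attention scores over a sampled subset of keys rather than all L of them. The abstract does not spell out the sampling scheme, so the PyTorch sketch below only illustrates the general pattern under stated assumptions: each query scores roughly c*log(L) uniformly sampled keys, keeping time and memory at O(L log(L)) instead of O(L^2). The function name sampled_attention, the constant c, and uniform key sampling are illustrative assumptions, not the actual Tspformer mechanism.

import math
import torch
import torch.nn.functional as F

def sampled_attention(q, k, v, c=5):
    """Sketch of sampled scaled dot-product attention with O(L log L) cost.

    Each query attends to a uniformly sampled subset of ceil(c * log L) keys
    instead of all L keys, so the score matrix has shape (L, c*log L)
    rather than (L, L). Shapes of q, k, v: (batch, L, d_model).
    """
    B, L, d = q.shape
    m = min(L, math.ceil(c * math.log(L + 1)))              # sampled keys per query
    idx = torch.randint(0, L, (B, L, m), device=q.device)   # random key indices

    batch_idx = torch.arange(B, device=q.device).view(B, 1, 1)
    k_s = k[batch_idx, idx]                                 # (B, L, m, d) sampled keys
    v_s = v[batch_idx, idx]                                 # (B, L, m, d) sampled values

    scores = torch.einsum('bld,blmd->blm', q, k_s) / math.sqrt(d)
    attn = F.softmax(scores, dim=-1)                        # weights over sampled keys only
    return torch.einsum('blm,blmd->bld', attn, v_s)         # (B, L, d)

# Toy usage: with 1000 nodes, each query scores ~35 sampled keys rather than 1000.
q = k = v = torch.randn(2, 1000, 64)
out = sampled_attention(q, k, v)
print(out.shape)  # torch.Size([2, 1000, 64])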
Keywords
Combinatorial optimization, Deep reinforcement learning, TSP, Transformer