Group Scissor: Scaling Neuromorphic Computing Design to Large Neural Networks.

DAC 2017

Abstract
The synapse crossbar is an elementary structure in neuromorphic computing systems (NCS). However, the limited size of crossbars and heavy routing congestion impede the NCS implementation of large neural networks. In this paper, we propose a two-step framework (namely, group scissor) to scale NCS designs to large neural networks. The first step, rank clipping, integrates low-rank approximation into the training to reduce total crossbar area. The second step, group connection deletion, structurally prunes connections to reduce routing congestion between crossbars. Tested on convolutional neural networks of LeNet on the MNIST database and ConvNet on the CIFAR-10 database, our experiments show significant reductions of crossbar and routing area in NCS designs. Without accuracy loss, rank clipping reduces the total crossbar area to 13.62% or 51.81% in the NCS design of LeNet or ConvNet, respectively. The subsequent group connection deletion further decreases the routing area of LeNet or ConvNet to 8.1% or 52.06%, respectively.
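To make the two steps concrete, the following is a minimal NumPy sketch of the ideas the abstract describes, not the authors' exact algorithm: rank clipping is illustrated here with a truncated SVD that splits one large weight matrix into two smaller factors (and hence two smaller crossbars), and group connection deletion is illustrated by computing per-group norms of the weight blocks routed between crossbar pairs, so that weak groups and their inter-crossbar wires can be removed. All names (rank_clip, group_norms, crossbar_size) are illustrative assumptions.

import numpy as np

def rank_clip(W, rank):
    """Approximate W (m x n) by a rank-`rank` factorization.
    Instead of one m x n crossbar, the layer maps to two smaller
    crossbars of sizes m x rank and rank x n."""
    U, S, Vt = np.linalg.svd(W, full_matrices=False)
    A = U[:, :rank] * S[:rank]   # m x rank factor
    B = Vt[:rank, :]             # rank x n factor
    return A, B

def group_norms(W, crossbar_size):
    """Group connections by the pair of crossbars they would be routed
    between and return each group's L2 norm; groups below a threshold
    can be deleted, removing the corresponding routing wires."""
    m, n = W.shape
    norms = {}
    for i in range(0, m, crossbar_size):
        for j in range(0, n, crossbar_size):
            block = W[i:i + crossbar_size, j:j + crossbar_size]
            norms[(i // crossbar_size, j // crossbar_size)] = np.linalg.norm(block)
    return norms

# Example: a 512 x 256 fully connected layer mapped onto 64 x 64 crossbars.
W = np.random.randn(512, 256) * 0.01
A, B = rank_clip(W, rank=32)
print("crossbar cells before:", W.size, "after rank clipping:", A.size + B.size)
print("first few group norms:", list(group_norms(W, 64).items())[:3])

In the paper the low-rank structure is learned during training rather than imposed after the fact, and the group pruning is driven by a structured regularizer; the sketch above only shows the resulting decomposition and grouping.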
Keywords
Neuromorphic computing, large neural networks, low-rank approximation