
Darwin3: a large-scale neuromorphic chip with a novel ISA and on-chip learning

De Ma, Xiaofei Jin, Shichun Sun, Yitao Li, Xundong Wu, Youneng Hu, Fangchao Yang, Huajin Tang, Xiaolei Zhu, Peng Lin, Gang Pan

National Science Review (2024)

Abstract
Spiking neural networks (SNNs) are gaining increasing attention for their biological plausibility and potential for improved computational efficiency. To match the high spatial-temporal dynamics in SNNs, neuromorphic chips are highly desired to execute SNNs directly in hardware-based neuron and synapse circuits. This paper presents a large-scale neuromorphic chip named Darwin3 with a novel instruction set architecture, which comprises 10 primary instructions and a few extended instructions. It supports flexible neuron model programming and local learning rule designs. The Darwin3 chip architecture is designed as a mesh of computing nodes with an innovative routing algorithm. We used a compression mechanism to represent synaptic connections, significantly reducing memory usage. The Darwin3 chip supports up to 2.35 million neurons and 100 million synapses, making it the largest of its kind on the neuron scale. The experimental results showed that the code density was improved by up to 28.3x in Darwin3, and that the neuron core fan-in and fan-out were improved by up to 4096x and 3072x by connection compression compared to the physical memory depth. Our Darwin3 chip also provided memory savings between 6.8x and 200.8x when mapping convolutional spiking neural networks onto the chip, demonstrating state-of-the-art performance in accuracy and latency compared to other neuromorphic chips.
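To give a feel for why connectivity compression shrinks synapse tables so dramatically, here is a minimal sketch of run-length encoding applied to a neuron's fan-out list. This is an illustrative example only, not Darwin3's actual hardware compression format; the function names and the run-length scheme are assumptions chosen because SNN fan-outs are often contiguous blocks of target neurons.

```python
# Illustrative sketch: run-length encoding of a neuron's fan-out.
# NOT the Darwin3 format; it only shows why contiguous synaptic
# targets compress far below a dense per-target table.

def compress_targets(targets):
    """Encode a list of target neuron IDs as (start, run_length) pairs."""
    runs = []
    for t in sorted(targets):
        if runs and t == runs[-1][0] + runs[-1][1]:
            # Extend the current contiguous run by one target.
            runs[-1] = (runs[-1][0], runs[-1][1] + 1)
        else:
            # Start a new run at this target ID.
            runs.append((t, 1))
    return runs

def decompress_targets(runs):
    """Expand (start, run_length) pairs back into target neuron IDs."""
    return [start + i for start, length in runs for i in range(length)]

# A neuron projecting to two contiguous blocks of targets compresses well:
fanout = list(range(100, 164)) + list(range(512, 520))
runs = compress_targets(fanout)
print(runs)                    # [(100, 64), (512, 8)]
print(len(fanout), len(runs))  # 72 targets stored as 2 runs
```

With regular projection patterns (as in convolutional SNN layers), the number of stored runs stays small even as the physical fan-out grows, which is the same intuition behind the paper's reported fan-in/fan-out gains over raw memory depth.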
Keywords
neuromorphic computing,spiking neural network,instruction set architecture,connectivity compression
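As a hedged illustration of the kind of neuron model a programmable neuromorphic ISA lets users express, the sketch below implements a discrete-time leaky integrate-and-fire (LIF) update in plain Python. The parameter names and values are hypothetical and are not taken from the Darwin3 instruction set.

```python
# Illustrative sketch: a leaky integrate-and-fire (LIF) neuron update,
# representative of the model class a flexible neuromorphic ISA targets.
# Parameters here are hypothetical, not Darwin3's.

def lif_step(v, input_current, leak=0.9, threshold=1.0, v_reset=0.0):
    """One discrete-time LIF update: leak, integrate, fire, reset."""
    v = leak * v + input_current
    if v >= threshold:
        return v_reset, 1  # spike emitted, membrane potential reset
    return v, 0            # subthreshold, no spike

# Drive the neuron with a constant input and collect its spike train:
v, spikes = 0.0, []
for _ in range(10):
    v, s = lif_step(v, 0.3)
    spikes.append(s)
print(spikes)  # [0, 0, 0, 1, 0, 0, 0, 1, 0, 0]
```

Swapping the update rule (e.g. adaptive thresholds or different reset behavior) changes only this per-step function, which is the flexibility the abstract attributes to instruction-level neuron programming.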