A Hardware Accelerator of the Convolutional Spike Neural Network Based on STDP Online Learning

2024 Conference of Science and Technology for Integrated Circuits (CSTIC)(2024)

Abstract
Convolutional Neural Networks (CNNs) excel at object recognition but are computationally intensive and unsuitable for online learning. Spiking Neural Networks (SNNs) [1] have low computational requirements, fast processing, and low power consumption, making them suitable for online learning. This paper proposes a hardware accelerator for a Convolutional Spiking Neural Network (Conv-SNN), which combines the strengths of CNNs and SNNs by introducing spiking mechanisms into the convolutional layers. It uses the Spike-Timing-Dependent Plasticity (STDP) and Reward-STDP (R-STDP) algorithms for training, achieving 95% accuracy on the MNIST dataset. It also supports few-shot learning. Implemented on Xilinx's ZCU102 FPGA at a 100 MHz clock frequency, the system takes 0.16 s for inference and 0.177 s for training per image, 16× faster than a CPU. Its power consumption is 3.979 W, two orders of magnitude lower than that of a CPU.
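For context, pair-based STDP adjusts a synaptic weight from the relative timing of pre- and postsynaptic spikes: a presynaptic spike shortly before a postsynaptic spike strengthens the synapse, the reverse ordering weakens it. The sketch below is a generic illustration of this rule, not the paper's implementation; the parameter names (`a_plus`, `a_minus`, `tau`) and the weight bounds are assumptions.

```python
import numpy as np

def stdp_update(w, t_pre, t_post, a_plus=0.01, a_minus=0.012, tau=20.0):
    """Pair-based STDP weight update (illustrative, not the paper's exact rule).

    w       : current synaptic weight
    t_pre   : presynaptic spike time (ms)
    t_post  : postsynaptic spike time (ms)
    """
    dt = t_post - t_pre
    if dt >= 0:
        # Causal pairing (pre before post): potentiation, decaying with |dt|
        dw = a_plus * np.exp(-dt / tau)
    else:
        # Anti-causal pairing (post before pre): depression
        dw = -a_minus * np.exp(dt / tau)
    # Keep the weight in a bounded range, as hardware implementations typically do
    return float(np.clip(w + dw, 0.0, 1.0))

# Pre spike 5 ms before post -> weight increases
w_up = stdp_update(0.5, t_pre=10.0, t_post=15.0)
# Post spike 5 ms before pre -> weight decreases
w_down = stdp_update(0.5, t_pre=15.0, t_post=10.0)
```

R-STDP extends this rule by scaling (or gating) the weight change with a scalar reward signal, so that only task-relevant spike pairings are reinforced.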
Keywords
Neural Network,Convolutional Neural Network,Online Learning,Spiking Neural Networks,Hardware Accelerators,Spike-timing-dependent Plasticity,Convolutional Layers,Power Consumption,Object Recognition,Low Power Consumption,MNIST Dataset,Clock Frequency,Low Computational Requirements,Learning Algorithms,Learning Rate,Simulation Software,Deep Convolutional Neural Network,Pooling Layer,Control Mode,Digital Networks,Reward Signal,Presynaptic Neurons,Difference Of Gaussian,Learning Rule,Synaptic Weights,Spike Data,Label Prediction,Postsynaptic Neurons,Feature Extraction Capability,Input Spike