A 22nm 0.43pJ/SOP Sparsity-Aware In-Memory Neuromorphic Computing System with Hybrid Spiking and Artificial Neural Network and Configurable Topology
2023 IEEE Custom Integrated Circuits Conference (CICC)
Abstract
Spiking neural networks (SNNs) dynamically process complex spatio-temporal information as asynchronous and highly sparse spikes with high energy efficiency (EE). However, the training algorithms for non-differentiable and discrete SNNs are still immature, leading to relatively low accuracy [1]. For instance, abnormal ECG detection is realized by an SNN in [2] with 0.53pJ/SOP EE, but the accuracy is only 90.5%. In [3], on-chip learning of a recurrent SNN for 1-word keyword spotting (KWS) achieved only 90.7% accuracy. In contrast, artificial neural networks (ANNs) can reach excellent accuracy through gradient-based backpropagation (BP) training but require substantial energy consumption due to their intensive computations and memory accesses. A unified ANN-SNN architecture was proposed in [4] for high accuracy, but it sacrifices EE due to massive data movement and a lack of sparsity utilization in the SNN.
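The sparsity the abstract refers to can be illustrated with a minimal leaky integrate-and-fire (LIF) neuron model: a synaptic operation (SOP) is performed only when an input spike actually arrives, so silent synapses cost no energy. This is a hedged sketch of the general SNN principle, not the paper's 22nm circuit; all parameter values (`leak`, `threshold`, the spike pattern) are illustrative assumptions.

```python
# Sketch of sparsity-aware spike processing with a simple LIF neuron.
# A SOP (synaptic operation) is counted only when an input spike fires,
# which is why sparse spike trains translate into low energy per SOP.

def lif_step(v, spikes_in, weights, leak=0.9, threshold=1.0):
    """One timestep: accumulate weighted input spikes, apply leak, fire."""
    sops = 0
    for s, w in zip(spikes_in, weights):
        if s:                 # sparsity: skip all work for silent synapses
            v += w
            sops += 1
    v *= leak                 # membrane leakage
    fired = v >= threshold
    if fired:
        v = 0.0               # reset membrane potential after a spike
    return v, fired, sops

# Sparse input: only 2 of 8 synapses spike this timestep,
# so only 2 SOPs are performed instead of 8 multiply-accumulates.
spikes = [1, 0, 0, 1, 0, 0, 0, 0]
weights = [0.6, 0.2, 0.1, 0.7, 0.3, 0.4, 0.2, 0.5]
v, fired, sops = lif_step(0.0, spikes, weights)
```

By contrast, a dense ANN layer would compute all eight multiply-accumulates regardless of input activity, which is the source of the EE gap the abstract describes.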