
A 22nm 0.43pJ/SOP Sparsity-Aware In-Memory Neuromorphic Computing System with Hybrid Spiking and Artificial Neural Network and Configurable Topology

Ying Liu, Zhiyuan Chen, Zhixuan Wang, Wentao Zhao, Wei He, Jianfeng Zhu, Qijun Wang, Ning Zhang, Tianyu Jia, Yufei Ma, Le Ye, Ru Huang

2023 IEEE Custom Integrated Circuits Conference (CICC), 2023

Abstract
Spiking neural networks (SNNs) dynamically process complex spatiotemporal information as asynchronous and highly sparse spikes with high energy efficiency (EE). However, training algorithms for non-differentiable and discrete SNNs are still immature, leading to relatively low accuracy [1]. For instance, abnormal ECG detection is realized by an SNN in [2] with 0.53pJ/SOP EE, but the accuracy is only 90.5%. In [3], on-chip learning of a recurrent SNN for 1-word keyword spotting (KWS) achieved only 90.7% accuracy. In contrast, artificial neural networks (ANNs) can reach excellent accuracy through gradient-based backpropagation (BP) training but require substantial energy consumption due to their intensive computations and memory accesses. A unified ANN-SNN architecture was proposed in [4] for high accuracy, but it sacrifices EE due to massive data movement and a lack of sparsity utilization in the SNN.
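To see why spike sparsity translates into energy savings, note that in an SNN a synaptic operation (SOP) is only performed when an input neuron actually fires, so at a fixed cost per SOP (e.g. the 0.43pJ/SOP figure above) total energy scales with spike activity rather than with the dense layer size, as it would for an ANN's multiply-accumulates. The sketch below is only illustrative and is not the paper's architecture; the layer sizes, leak factor, threshold, and ~10% input sparsity are assumed values.

```python
import numpy as np

def lif_layer(spikes_in, weights, v, v_th=1.0, leak=0.9):
    """One timestep of a leaky integrate-and-fire (LIF) layer.

    SOPs are triggered only by input spikes, so the work done
    scales with input sparsity, not with the dense weight matrix.
    All parameters here are illustrative, not from the paper.
    """
    active = np.flatnonzero(spikes_in)          # indices of input spikes
    sops = active.size * weights.shape[1]       # SOPs performed this step
    v = leak * v + weights[active].sum(axis=0)  # accumulate only active rows
    spikes_out = (v >= v_th).astype(np.int8)    # fire where threshold crossed
    v = np.where(spikes_out == 1, 0.0, v)       # reset fired neurons
    return spikes_out, v, sops

rng = np.random.default_rng(0)
w = rng.normal(0.0, 0.5, size=(64, 32))          # assumed 64-in, 32-out layer
v = np.zeros(32)
spikes = (rng.random(64) < 0.1).astype(np.int8)  # ~10% input sparsity
out, v, sops = lif_layer(spikes, w, v)
dense_macs = w.size                              # ANN cost for the same layer
print(f"SOPs: {sops}, dense MACs: {dense_macs}")
```

At ~10% input sparsity the SOP count is roughly a tenth of the dense MAC count, which is the effect a sparsity-aware design exploits in hardware by skipping computation and memory accesses for silent inputs.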