RMP-Loss: Regularizing Membrane Potential Distribution for Spiking Neural Networks

2023 IEEE/CVF International Conference on Computer Vision (ICCV)

Abstract
Spiking Neural Networks (SNNs), as biology-inspired models, have received much attention recently. They can significantly reduce energy consumption because they quantize real-valued membrane potentials to 0/1 spikes to transmit information, so the multiplications of activations and weights can be replaced by additions when implemented on hardware. However, this quantization mechanism inevitably introduces quantization error, causing catastrophic information loss. To address the quantization-error problem, we propose a regularizing membrane potential loss (RMP-Loss) that adjusts the membrane potential distribution, which is directly related to the quantization error, to a range close to the spike values. Our method is extremely simple to implement and makes training an SNN straightforward. Furthermore, it consistently outperforms previous state-of-the-art methods across different network architectures and datasets.
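The abstract only states that the membrane potential distribution is regularized toward the spike values; the concrete loss is defined in the paper itself. Below is a minimal PyTorch-style sketch of the general idea under stated assumptions: a hard 0/1 spike threshold, a firing threshold of 1.0, an illustrative penalty that pulls each membrane potential toward the nearer of the two quantization targets (0 or the threshold), and an assumed weighting coefficient. The function names, the exact penalty form, and the coefficient are hypothetical, not the authors' formulation.

```python
import torch

def spike_fn(v, v_th=1.0):
    # Hard threshold: the neuron emits a 0/1 spike when the membrane
    # potential reaches the (assumed) firing threshold v_th.
    return (v >= v_th).float()

def rmp_style_regularizer(v, v_th=1.0):
    # Illustrative penalty (not the paper's exact formula): pull each
    # membrane potential toward the nearer of the two quantization targets,
    # 0 and v_th, so that thresholding to 0/1 spikes loses less information.
    return torch.minimum(v.pow(2), (v - v_th).pow(2)).mean()

# Toy usage: the regularizer would be added to the ordinary task loss.
v = torch.randn(8, 16, requires_grad=True)   # hypothetical membrane potentials
spikes = spike_fn(v)                          # binary activations sent downstream
reg = rmp_style_regularizer(v)
# total_loss = task_loss + lambda_reg * reg  # lambda_reg is an assumed weighting coefficient
reg.backward()
```

In this sketch the thresholding itself is non-differentiable, so gradients reach the membrane potentials only through the regularizer; a real SNN training setup would also use a surrogate gradient for the spike function.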
Keywords
spiking neural networks, membrane potential distribution, RMP-Loss