In-situ learning in multilayer locally-connected memristive spiking neural network

Neurocomputing (2021)

Cited by 3 | Viewed 22
Abstract
Memristive spiking neural networks (MSNNs) have great potential to process information with higher efficiency and lower latency than conventional artificial neural networks (ANNs). However, MSNNs still lack effective hardware-based training algorithms that achieve performance comparable to mature ANNs. Therefore, a multilayer locally-connected (LC) MSNN is proposed to realize high performance with self-adaptive and in-situ learning. In the LC-MSNN, spatial and temporal interactions are introduced to activate hidden neurons spontaneously; synaptic weights are updated locally with spike-timing-dependent plasticity (STDP) through a pulse scheme comprising processing and updating phases; and the nonlinear conductance response (CR) of the memristive devices is exploited to realize an adjustive learning rate. The LC-MSNN is comprehensively verified and benchmarked on the MNIST dataset. Moreover, the self-adaptive activation of hidden neurons is investigated by extracting and visualizing their internal states and related features, and the adjustive learning rate is studied under different degrees of CR nonlinearity. Effects of non-idealities, including finite resolution, device-to-device variation, and yield, are also taken into account. Simulation results show that the LC-MSNN achieves high performance (a maximum recognition rate of 97.4%) and strong robustness to non-idealities. This hardware-friendly algorithm can therefore be applied to realize high-performance SNNs in memristor-based hardware systems.
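To make the mechanism in the abstract concrete, below is a minimal Python sketch of how a pair-based STDP update combined with a nonlinear conductance response could yield a state-dependent (adjustive) learning rate. The exponential CR form, the parameter values, and the function names are illustrative assumptions for this sketch, not the paper's actual pulse scheme or device model.

```python
import numpy as np

# Illustrative device/learning parameters (hypothetical values, not from the paper).
G_MIN, G_MAX = 0.1, 1.0        # conductance bounds (normalized)
ALPHA_P, ALPHA_D = 0.05, 0.05  # base potentiation/depression step sizes
BETA = 3.0                     # nonlinearity of the conductance response (CR)
TAU_STDP = 20.0                # STDP time window (ms)

def conductance_update(G, potentiate):
    """Nonlinear conductance response: the step size shrinks as the device
    approaches its bound, acting as a state-dependent (adjustive) learning rate."""
    x = (G - G_MIN) / (G_MAX - G_MIN)            # normalized conductance state
    if potentiate:
        dG = ALPHA_P * np.exp(-BETA * x)         # harder to increase when already high
    else:
        dG = -ALPHA_D * np.exp(-BETA * (1 - x))  # harder to decrease when already low
    return float(np.clip(G + dG, G_MIN, G_MAX))

def stdp_update(G, t_pre, t_post):
    """Pair-based STDP: potentiate if the presynaptic spike precedes the
    postsynaptic spike within the STDP window, otherwise depress."""
    dt = t_post - t_pre
    if 0 < dt <= TAU_STDP:
        return conductance_update(G, potentiate=True)
    return conductance_update(G, potentiate=False)

# Example: pre-spike at 5 ms, post-spike at 12 ms -> potentiation of one synapse.
G = 0.5
G = stdp_update(G, t_pre=5.0, t_post=12.0)
print(f"updated conductance: {G:.4f}")
```

With this CR form, a synapse near G_MAX receives only small potentiation steps while a mid-range synapse moves quickly, which is one common way a device's nonlinearity can be reinterpreted as an adaptive learning-rate schedule rather than a defect.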
Keywords
Memristor, SNNs, Locally-connected, Self-adaptive, In-situ, Adjustive learning rate, STDP