
Fractional Gradient Descent Method for Spiking Neural Networks

2023 2nd Conference on Fully Actuated System Theory and Applications (CFASTA 2023)

Abstract
A fractional-order gradient descent method for spiking neural networks (SNNs) is proposed to address the difficulty of training SNNs with the stochastic gradient descent method. The method improves both the location of the gradient backpropagation computation during SNN training and the output form of the last layer of the network, and its convergence is proved using the fractional-order difference expression under the Grünwald-Letnikov definition. Test results on both pure SNN and CNN2SNN models show that the method effectively reduces the number of sample cycle inputs, and hence the training time, without reducing accuracy.
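The abstract does not state the paper's update rule. As an illustrative sketch only, one widely used fractional-order gradient descent scheme approximates the fractional derivative of the loss with the previous iterate as the lower terminal; the function names, constants, and the scalar test problem below are assumptions for illustration, not taken from the paper:

```python
import math

def fractional_gd(grad, x0, alpha=0.9, lr=0.1, steps=200):
    """Illustrative fractional-order gradient descent on a scalar parameter.

    The fractional derivative of order alpha (0 < alpha <= 1) is approximated
    with the previous iterate as the lower terminal, giving the update
        x_{k+1} = x_k - lr * grad(x_k) * |x_k - x_{k-1}|**(1 - alpha) / Gamma(2 - alpha).
    alpha = 1 recovers ordinary gradient descent.
    """
    gamma = math.gamma(2.0 - alpha)
    x_prev = x0
    x = x0 - lr * grad(x0)  # bootstrap the first step with plain gradient descent
    for _ in range(steps - 1):
        step = lr * grad(x) * abs(x - x_prev) ** (1.0 - alpha) / gamma
        x_prev, x = x, x - step
    return x

# Minimize f(x) = (x - 3)^2, whose gradient is 2(x - 3).
x_star = fractional_gd(lambda x: 2.0 * (x - 3.0), x0=0.0)
```

With 0 < alpha < 1 the factor |x_k - x_{k-1}|^(1-alpha) shrinks the step as the iterates settle, which is one reason fractional-order variants are studied as accelerated or better-conditioned alternatives to plain gradient descent.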
Key words
Spiking Neural Networks, Stochastic Gradient Descent Method, Fractional Order Difference