Efficiency Study of SSAG on RNN Framework

Xiaowei Xie, Aixiang Chen

2022 IEEE 6th Advanced Information Technology, Electronic and Automation Control Conference (IAEAC)

Abstract
SGD (stochastic gradient descent) is widely used in deep learning; however, SGD does not achieve linear convergence and is inefficient on large datasets. This paper uses SSAG to improve efficiency. SSAG combines two optimization strategies: stratified sampling and historical gradient averaging. It offers fast, variance-reduced convergence, flexible application to big data, and straightforward use in deep networks. This paper studies the efficiency of the SSAG gradient optimization algorithm on an RNN framework. The proposed RNN framework comprises a feature extraction layer, a stacked RNN layer, and a transcription layer. The experimental results confirm that SSAG achieves better accuracy than SGD and Momentum, and that both the stratified sampling and historical averaging strategies improve task accuracy. Experiments further verify that SSAG performs better on image classification tasks.
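
The abstract characterizes SSAG as pairing stratified sampling with a running average of historical per-sample gradients, in the spirit of SAG-type methods. The sketch below illustrates what such an update could look like under that reading; it is a minimal illustration, not the authors' implementation, and the names grad_fn, strata, and lr, as well as the uniform stratum-then-sample draw, are hypothetical.

```python
import numpy as np

def ssag_sketch(grad_fn, w, strata, lr=0.01, epochs=10, seed=0):
    """Illustrative SSAG-style update (assumption: SAG averaging + stratified sampling).

    grad_fn(w, i) -> gradient of the loss on sample i at parameters w (hypothetical).
    strata        -> list of index arrays, e.g. training samples grouped by class.
    """
    rng = np.random.default_rng(seed)
    n = sum(len(s) for s in strata)
    memory = np.zeros((n, w.size))   # stored historical gradient for each sample
    avg = np.zeros(w.size)           # running average of all stored gradients
    for _ in range(epochs):
        for _ in range(n):
            s = strata[rng.integers(len(strata))]  # stratified draw: pick a stratum...
            i = int(s[rng.integers(len(s))])       # ...then a sample inside it
            g = grad_fn(w, i)
            avg += (g - memory[i]) / n             # swap sample i's old gradient out of the average
            memory[i] = g
            w = w - lr * avg                       # step along the historical average
    return w
```

In the paper's setting, grad_fn would backpropagate through the feature extraction, stacked RNN, and transcription layers described above.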
Key words
SSAG, RNN framework, stratified sampling, historical gradient