
Weak and strong convergence analysis of Elman neural networks via weight decay regularization

OPTIMIZATION (2023)

Cited 25
Abstract
In this paper, we propose a novel variant of the gradient training algorithm to improve the generalization performance of Elman neural networks (ENN). A weight decay term, also known as L-2 regularization, is added to the error function; it effectively controls excessive growth of the weights and thereby helps prevent over-fitting. The main contribution of this work is a rigorous theoretical analysis of the proposed approach: both weak and strong convergence results are obtained. Comparison experiments on function approximation and classification problems with real-world data are performed to verify the theoretical results.
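A minimal sketch of the weight-decay objective assumed here (the paper's exact notation may differ): let E~(w) be the ordinary ENN training error, lambda > 0 the regularization coefficient, and eta > 0 the learning rate. The regularized error and the resulting gradient update are

E(w) = \tilde{E}(w) + \frac{\lambda}{2}\,\|w\|^{2},
\qquad
w^{k+1} = w^{k} - \eta \nabla E(w^{k}) = w^{k} - \eta \nabla \tilde{E}(w^{k}) - \eta \lambda w^{k}.

The extra term -eta*lambda*w^k shrinks the weights at every iteration, which is what bounds their growth. In convergence analyses of this type, "weak" convergence typically means the gradient norm tends to zero, \|\nabla E(w^{k})\| \to 0, while "strong" convergence means the weight sequence itself converges, w^{k} \to w^{*}; the precise statements here follow the paper's own definitions.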
Key words
Elman neural networks, L-2 regularization, gradient method, convergence