Weak and strong convergence analysis of Elman neural networks via weight decay regularization
OPTIMIZATION (2023)
Abstract
In this paper, we propose a novel variant of the gradient algorithm to improve the generalization performance of Elman neural networks (ENN). A weight decay term, also called L2 regularization, is added to the loss function; it effectively controls excessive growth of the weights and thereby prevents the over-fitting phenomenon. The main contribution of this work is a rigorous theoretical analysis of the proposed approach: both weak and strong convergence results are established. Comparative experiments on function approximation and classification problems with real-world data verify the theoretical results.
Key words
Elman neural networks, L2 regularization, gradient method, convergence
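The weight decay term described in the abstract penalizes the squared norm of the weights, so its gradient simply shrinks each weight toward zero on every update. A minimal sketch of one such gradient step is shown below; the names `eta` (learning rate) and `lam` (decay coefficient) are illustrative and not taken from the paper.

```python
# Minimal sketch of one gradient step on a loss with an L2 (weight decay)
# penalty: L(w) + (lam / 2) * ||w||^2. The gradient of the penalty term
# is lam * w, so the update shrinks each weight toward zero in addition
# to following the loss gradient. Parameter names are illustrative.

def sgd_step_with_weight_decay(w, grad, eta=0.1, lam=0.01):
    """Return the updated weight list after one regularized gradient step."""
    return [wi - eta * (gi + lam * wi) for wi, gi in zip(w, grad)]

# With a zero loss gradient, weight decay alone shrinks the weights:
w = [1.0, -2.0]
w_next = sgd_step_with_weight_decay(w, grad=[0.0, 0.0])
# each weight is multiplied by (1 - eta * lam) = 0.999
```

This multiplicative shrinkage is what keeps the weights bounded during training, which is the property the paper's convergence analysis relies on.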