A High Speed Convergent Formula for Time-Variant Generalized Sylvester Equation Solving

2023 5th International Conference on Industrial Artificial Intelligence (IAI)(2023)

Abstract
The construction of novel and faster classes of recurrent neural networks (RNNs) is an important and active field of study in engineering and applied mathematics. Various nonlinear, odd, and monotonically increasing functions, known as activation functions (AFs), have been employed to accelerate the convergence of RNN formulas. In this work, a faster model based on a novel AF is constructed and applied to solving the time-varying generalized Sylvester equation (TVGSE); theoretical convergence analysis and simulation experiments in the MATLAB environment demonstrate the effectiveness of the proposed formula.
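The abstract does not state the paper's design formula or its novel AF. The sketch below only illustrates the general zeroing neural network (ZNN) approach named in the keywords: it assumes a TVGSE of the form A(t)X(t) + X(t)B(t) = C(t), defines the error E(t) = A(t)X(t) + X(t)B(t) - C(t), and evolves it by the standard ZNN dynamics dE/dt = -gamma * AF(E). The coefficient matrices, the sign-bi-power-style AF, and the Euler integration step are all hypothetical stand-ins, not the authors' formulation.

```python
import numpy as np
from scipy.linalg import solve_sylvester

# Placeholder AF: the paper's novel AF is not given in the abstract, so a
# common sign-bi-power-style AF (odd and increasing) is used for illustration.
def af(E, r=0.5):
    return np.sign(E) * np.abs(E) ** r + E

# Hypothetical time-varying coefficients A(t), B(t), C(t) of the TVGSE.
def A(t): return np.array([[2 + np.sin(t), 0.5], [0.5, 2 + np.cos(t)]])
def B(t): return np.array([[1 + 0.5 * np.cos(t), 0.0], [0.0, 1 + 0.5 * np.sin(t)]])
def C(t): return np.array([[np.sin(t), np.cos(t)], [np.cos(t), np.sin(t)]])

def znn_solve_tvgse(T=10.0, dt=1e-3, gamma=10.0):
    """Integrate the ZNN dynamics dE/dt = -gamma * AF(E) with
    E(t) = A(t) X(t) + X(t) B(t) - C(t), using explicit Euler steps."""
    def num_dot(f, t, h=1e-6):          # numerical time derivative of a matrix-valued function
        return (f(t + h) - f(t - h)) / (2 * h)

    t, X = 0.0, np.zeros((2, 2))        # arbitrary initial state X(0)
    while t < T:
        E = A(t) @ X + X @ B(t) - C(t)
        # ZNN design formula rearranged as a Sylvester equation in dX/dt:
        #   A dX + dX B = C' - A' X - X B' - gamma * AF(E)
        rhs = num_dot(C, t) - num_dot(A, t) @ X - X @ num_dot(B, t) - gamma * af(E)
        dX = solve_sylvester(A(t), B(t), rhs)
        X += dt * dX
        t += dt
    return X

X_final = znn_solve_tvgse()
print("residual norm at t=10:",
      np.linalg.norm(A(10.0) @ X_final + X_final @ B(10.0) - C(10.0)))
```

The coefficient matrices above are chosen so that A(t) and -B(t) never share eigenvalues, which keeps each Sylvester solve for dX/dt well posed; a sharper AF or a larger gamma shortens the transient at the cost of a smaller admissible Euler step.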
Keywords
zeroing neural network,predefined convergence,Lyapunov candidate,linear algebra,control theory,activation function