Improved zeroing neural models based on two novel activation functions with exponential behavior

THEORETICAL COMPUTER SCIENCE (2024)

Abstract
A family of zeroing neural networks (ZNN) based on new nonlinear activation functions is proposed for solving various time-varying linear matrix equations (TVLME). The proposed neural network dynamical systems, denoted Li-VPZNN1 and Li-VPZNN2, include an exponential parameter in the nonlinear activation function (AF) that leads to faster convergence to the theoretical solution than previous classes of nonlinearly activated neural networks. Theoretical analysis and numerical tests in the MATLAB environment confirm the efficiency and accelerated convergence of the novel dynamics.
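The ZNN design underlying such models defines a matrix-valued error E(t) = A(t)X(t) - B(t) and drives it to zero through the design formula dE/dt = -gamma * Phi(E(t)), where Phi is the nonlinear AF applied elementwise. The following Python sketch illustrates this general scheme on a small TVLME; the coefficient matrices, the gain gamma, and the sign-bi-power activation used here are illustrative assumptions, not the paper's specific Li-type AFs with exponential behavior.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative time-varying coefficients A(t), B(t) and their derivatives
# (hypothetical example problem, not taken from the paper).
def A(t):
    return np.array([[2 + np.sin(t),  np.cos(t)],
                     [-np.cos(t),     2 + np.sin(t)]])

def dA(t):
    return np.array([[np.cos(t), -np.sin(t)],
                     [np.sin(t),  np.cos(t)]])

def B(t):
    return np.array([[np.sin(t),  np.cos(t)],
                     [-np.cos(t), np.sin(t)]])

def dB(t):
    return np.array([[np.cos(t), -np.sin(t)],
                     [np.sin(t),  np.cos(t)]])

def phi(E, r=0.25):
    # Placeholder nonlinear activation (sign-bi-power), applied elementwise.
    # The paper's Li-type AFs with an exponential parameter are not reproduced here.
    return 0.5 * np.sign(E) * (np.abs(E) ** r + np.abs(E) ** (1.0 / r))

def znn_rhs(t, x, gamma=10.0):
    X = x.reshape(2, 2)
    E = A(t) @ X - B(t)                    # error matrix E(t) = A(t)X(t) - B(t)
    # ZNN design formula dE/dt = -gamma * phi(E) gives
    # A(t) dX/dt = -gamma * phi(E) - dA(t) X + dB(t).
    rhs = -gamma * phi(E) - dA(t) @ X + dB(t)
    dX = np.linalg.solve(A(t), rhs)
    return dX.ravel()

# Integrate the ZNN dynamics from a zero initial state and check the residual.
sol = solve_ivp(znn_rhs, (0.0, 2.0), np.zeros(4), rtol=1e-8, atol=1e-10)
X_final = sol.y[:, -1].reshape(2, 2)
print("residual ||A(t)X(t) - B(t)|| at t = 2:",
      np.linalg.norm(A(2.0) @ X_final - B(2.0)))
```

With a sufficiently large gain, the residual decays rapidly toward zero, which is the accelerated-convergence behavior the nonlinear AFs are designed to strengthen.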
Key words
Zhang neural network, Time-varying matrix, Matrix inverse, Hyperpower iterative methods