
A new approach to training neural networks using natural gradient descent with momentum based on Dirichlet distributions

R. I. Abdulkadirov, P. A. Lyakhov

COMPUTER OPTICS (2023)

Abstract
In this paper, we propose a natural gradient descent algorithm with momentum based on Dirichlet distributions to speed up the training of neural networks. This approach takes into account not only the direction of the gradients, but also the convexity of the minimized function, which significantly accelerates the search for extrema. Calculations of natural gradients based on Dirichlet distributions are presented, and the proposed approach is introduced into an error backpropagation scheme. The results of image recognition and time series forecasting experiments show that the proposed approach gives higher accuracy and requires fewer iterations to minimize loss functions than stochastic gradient descent, adaptive moment estimation, and the adaptive parameter-wise diagonal quasi-Newton method for nonconvex stochastic optimization.
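The abstract does not give the authors' update rule, but the general idea it describes — preconditioning the gradient with the Fisher information matrix of a Dirichlet distribution and adding momentum — can be sketched. The sketch below is a hypothetical illustration, not the paper's algorithm: it uses the standard closed-form Fisher matrix of a Dirichlet distribution with concentration vector alpha, F = diag(psi'(alpha_i)) - psi'(sum(alpha)), where psi' is the trigamma function, and combines the resulting natural gradient with classical momentum. All function names and hyperparameters (lr, beta) are illustrative choices.

```python
import numpy as np

def trigamma(x):
    """psi'(x) via the recurrence psi'(x) = psi'(x+1) + 1/x^2,
    then the standard asymptotic series once x >= 6."""
    result = 0.0
    while x < 6.0:
        result += 1.0 / (x * x)
        x += 1.0
    inv = 1.0 / x
    inv2 = inv * inv
    # psi'(x) ~ 1/x + 1/(2x^2) + 1/(6x^3) - 1/(30x^5) + 1/(42x^7)
    result += inv * (1.0 + 0.5 * inv
                     + inv2 * (1.0 / 6.0 - inv2 * (1.0 / 30.0 - inv2 / 42.0)))
    return result

def dirichlet_fisher(alpha):
    """Fisher information matrix of Dirichlet(alpha):
    F_ij = psi'(alpha_i) * delta_ij - psi'(sum(alpha))."""
    a0 = float(np.sum(alpha))
    diag = np.array([trigamma(a) for a in alpha])
    return np.diag(diag) - trigamma(a0)

def natural_gradient_momentum_step(theta, grad, velocity, alpha,
                                   lr=0.1, beta=0.9):
    """One hypothetical update: precondition the loss gradient by the
    inverse Fisher matrix, then accumulate classical momentum."""
    F = dirichlet_fisher(alpha)
    nat_grad = np.linalg.solve(F, grad)  # F^{-1} g without explicit inversion
    velocity = beta * velocity + nat_grad
    theta = theta - lr * velocity
    return theta, velocity
```

Because the Dirichlet Fisher matrix is positive definite for any positive alpha, the linear solve is always well posed; solving F x = g rather than forming F^{-1} explicitly is the usual numerically stable choice.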
Key words
pattern recognition, machine learning, optimization, Dirichlet distributions, natural gradient descent