Parametric complexity analysis for a class of first-order Adagrad-like algorithms

CoRR (2022)

Abstract
A class of algorithms for optimization in the presence of noise is presented that does not require evaluating the objective function. This class generalizes the well-known Adagrad method. The complexity of this class is then analyzed as a function of its parameters, and it is shown that some methods in the class enjoy a better asymptotic convergence rate than was previously known. A new class of algorithms with similar characteristics is then derived. Initial numerical experiments suggest that it may have some merit in practice.
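To make the flavor of such methods concrete, here is a minimal sketch of a parametric Adagrad-like update. It is not the authors' exact algorithm, only an illustration of the general idea: the classical Adagrad square-root scaling is replaced by a tunable exponent, and only gradients are used, never objective values. All names (mu, varsigma, eta) are illustrative assumptions.

```python
import numpy as np

def adagrad_like_step(x, grad, accum, eta=1.0, mu=0.5, varsigma=1e-8):
    """One update of a hypothetical parametric Adagrad-like method.

    mu = 0.5 recovers the classical (diagonal) Adagrad scaling; other
    values of mu select other members of the parametric class. Only the
    gradient is used; the objective function is never evaluated.
    """
    accum = accum + grad**2                  # accumulate squared gradients
    scale = (varsigma + accum) ** mu         # per-coordinate scaling factor
    x = x - eta * grad / scale               # scaled gradient step
    return x, accum

# Usage on a toy quadratic f(x) = 0.5 * ||x||^2, whose gradient is x:
x = np.array([3.0, -2.0])
accum = np.zeros_like(x)
for _ in range(200):
    x, accum = adagrad_like_step(x, x, accum)  # gradient of 0.5*||x||^2 is x
print(x)  # should be close to the minimizer at the origin
```

With mu = 0.5 this reduces to standard Adagrad; the paper's complexity analysis concerns how the convergence rate of such methods varies with the parameters.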