
On Some Works of Boris Teodorovich Polyak on the Convergence of Gradient Methods and Their Development

Computational Mathematics and Mathematical Physics (2024)

Abstract
The paper reviews the current state of subgradient and accelerated convex optimization methods, including settings with noise and with access to different kinds of information about the objective function (function values, gradients, stochastic gradients, higher derivatives). For nonconvex problems, the Polyak–Lojasiewicz condition is considered and the main results are surveyed. The behavior of numerical methods in the presence of a sharp minimum is also discussed. The aim of this review is to show the influence of the works of B.T. Polyak (1935–2023) on gradient optimization methods and related areas on the modern development of numerical optimization methods.
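One of the methods named in the abstract, Polyak's heavy ball method, augments gradient descent with a momentum term. A minimal illustrative sketch on a quadratic objective is given below; the function names and the specific step-size/momentum values are assumptions chosen for illustration (here the classical textbook choices for a strongly convex quadratic), not parameters taken from the paper under review.

```python
import numpy as np

def heavy_ball(grad, x0, alpha, beta, steps=200):
    """Polyak's heavy ball iteration (illustrative sketch):
        x_{k+1} = x_k - alpha * grad(x_k) + beta * (x_k - x_{k-1})
    The momentum term beta * (x_k - x_{k-1}) accelerates progress
    along directions of consistent descent."""
    x_prev = x0.copy()
    x = x0.copy()
    for _ in range(steps):
        x_next = x - alpha * grad(x) + beta * (x - x_prev)
        x_prev, x = x, x_next
    return x

# Example: minimize f(x) = 0.5 * x^T A x for an ill-conditioned
# quadratic with eigenvalues mu = 1 and L = 100.
A = np.diag([1.0, 100.0])
grad = lambda x: A @ x
mu, L = 1.0, 100.0

# Classical parameter choice for strongly convex quadratics (assumed here):
alpha = 4.0 / (np.sqrt(L) + np.sqrt(mu)) ** 2
beta = ((np.sqrt(L) - np.sqrt(mu)) / (np.sqrt(L) + np.sqrt(mu))) ** 2

x_star = heavy_ball(grad, np.array([1.0, 1.0]), alpha, beta)
print(np.linalg.norm(x_star))  # close to 0: the minimizer of f
```

With these parameters the iteration contracts at a rate governed by sqrt(beta) per step on each eigendirection, which is the acceleration over plain gradient descent that the review discusses.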
Key words
gradient descent, gradient dominance condition (Polyak–Lojasiewicz), sharp minimum, subgradient Polyak–Shor method, early stopping condition, Polyak heavy ball method, stochastic gradient descent