General Derivative-Free Optimization Methods under Global and Local Lipschitz Continuity of Gradients

arXiv (2023)

Abstract
This paper addresses the study of derivative-free smooth optimization problems, where gradient information on the objective function is unavailable. Two novel general derivative-free methods are proposed and developed for minimizing such functions with either globally or locally Lipschitz continuous gradients. The newly developed methods use gradient approximations based on finite differences, where the finite difference intervals are automatically adapted to the magnitude of the exact gradients without knowing them exactly. The suggested algorithms achieve fundamental convergence results, including stationarity of accumulation points in general settings as well as global convergence with constructive convergence rates when the Kurdyka-Łojasiewicz property is imposed. The local convergence of the proposed algorithms to nonisolated local minimizers, along with their local convergence rates, is also analyzed under this property. Numerical experiments involving various convex, nonconvex, noiseless, and noisy functions demonstrate that the new methods exhibit essential advantages over other state-of-the-art methods in derivative-free optimization.
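The paper's exact adaptive interval rule is not reproduced in the abstract, so the following Python sketch only illustrates the general idea it describes: approximating gradients by forward finite differences and re-scaling the difference interval to the magnitude of the current gradient estimate. All names and constants here (fd_gradient, adaptive_fd_descent, c, h0, the stopping tolerance) are illustrative assumptions, not the authors' algorithm.

import numpy as np

def fd_gradient(f, x, h):
    # Forward finite-difference approximation of the gradient of f at x,
    # using the same interval h in every coordinate direction.
    n = x.size
    g = np.empty(n)
    fx = f(x)
    for i in range(n):
        e = np.zeros(n)
        e[i] = h
        g[i] = (f(x + e) - fx) / h
    return g

def adaptive_fd_descent(f, x0, step=1e-2, h0=1e-1, c=1e-4, max_iter=500):
    # Gradient descent driven by finite-difference gradients whose interval
    # is adapted to the estimated gradient magnitude: smaller estimated
    # gradients trigger finer difference intervals. This is a hypothetical
    # stand-in for the adaptive rule the paper develops.
    x, h = x0.astype(float), h0
    for _ in range(max_iter):
        g = fd_gradient(f, x, h)
        gnorm = np.linalg.norm(g)
        if gnorm < 1e-8:
            break
        h = max(c * gnorm, 1e-12)  # re-scale the interval to the gradient size
        x = x - step * g
    return x

if __name__ == "__main__":
    # Example: minimize a smooth convex quadratic without using its gradient.
    f = lambda x: (x[0] - 1.0) ** 2 + 2.0 * (x[1] + 0.5) ** 2
    print(adaptive_fd_descent(f, np.array([3.0, 3.0])))  # approaches (1, -0.5)

The key design point the abstract emphasizes is that the interval h is tied to the (unknown) gradient magnitude through its computable estimate, which is what allows convergence guarantees without exact gradient information.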