Several Guaranteed Descent Conjugate Gradient Methods for Unconstrained Optimization.

Journal of Applied Mathematics (2014)

Abstract
This paper investigates a general form of guaranteed descent conjugate gradient methods satisfying the descent condition $g_k^\top d_k \le -\left(1 - \tfrac{1}{4\theta_k}\right)\|g_k\|^2$ with $\theta_k > 1/4$, and which is strongly convergent whenever the weak Wolfe line search conditions are fulfilled. Moreover, we present several specific guaranteed descent conjugate gradient methods and report their numerical results on large-scale unconstrained optimization problems.
Key words
conjugate gradient, optimization
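The descent condition in the abstract can be illustrated with a Hager–Zhang-type conjugate gradient update, which is a well-known member of this family: with the CG parameter $\beta_k = \big(y_k - \theta\, d_k \|y_k\|^2 / (d_k^\top y_k)\big)^\top g_{k+1} / (d_k^\top y_k)$ and a scalar $\theta > 1/4$, the direction $d_{k+1} = -g_{k+1} + \beta_k d_k$ satisfies $g_{k+1}^\top d_{k+1} \le -(1 - \tfrac{1}{4\theta})\|g_{k+1}\|^2$ independently of the line search. The sketch below is not the paper's specific method; it is a minimal numerical check of that bound on a convex quadratic, with a deliberately inexact step in place of a weak Wolfe search:

```python
import numpy as np

def hz_beta(g_new, d, y, theta=2.0):
    """Hager-Zhang-type CG parameter with scalar theta > 1/4.

    With this beta, d_new = -g_new + beta*d satisfies the bound
    g_new^T d_new <= -(1 - 1/(4*theta)) * ||g_new||^2.
    """
    dy = d @ y
    return (y - theta * d * (y @ y) / dy) @ g_new / dy

# Demo objective: f(x) = 0.5 x^T A x - b^T x, so grad f(x) = A x - b.
rng = np.random.default_rng(0)
M = rng.standard_normal((5, 5))
A = M @ M.T + 5 * np.eye(5)          # symmetric positive definite
b = rng.standard_normal(5)
grad = lambda x: A @ x - b

theta = 2.0                          # theta = 2 recovers Hager-Zhang
x = np.zeros(5)
g = grad(x)
d = -g
for _ in range(10):
    # Deliberately inexact step: 0.8 times the exact minimizer along d,
    # standing in for a weak Wolfe line search.
    alpha = 0.8 * (-(g @ d) / (d @ A @ d))
    x_new = x + alpha * d
    g_new = grad(x_new)
    y = g_new - g
    beta = hz_beta(g_new, d, y, theta)
    d_new = -g_new + beta * d
    # Verify the guaranteed-descent bound from the abstract.
    assert g_new @ d_new <= -(1 - 1 / (4 * theta)) * (g_new @ g_new) + 1e-10
    x, g, d = x_new, g_new, d_new

print(np.linalg.norm(g))             # gradient norm after 10 iterations
```

The bound holds at every iteration even though the steps are inexact, which is the point of "guaranteed" descent: the direction is a descent direction by construction, not by virtue of an accurate line search.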