
Gradient method

From Wikipedia, the free encyclopedia

In optimization, a gradient method is an algorithm to solve problems of the form

{\displaystyle \min _{x\in \mathbb {R} ^{n}}\;f(x)}

with the search directions defined by the gradient of the function at the current point. Examples of gradient methods are gradient descent and the conjugate gradient method.
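The idea above can be sketched with gradient descent, the simplest gradient method: at each step, move against the gradient at the current point. The objective f(x) = x₁² + x₂², the fixed step size, and the function names below are illustrative assumptions, not part of the article.

```python
import numpy as np

def gradient_descent(grad, x0, step=0.1, iters=100):
    """Minimize f by iterating x <- x - step * grad(x).

    The search direction at each step is the negative gradient
    of f at the current point.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        x = x - step * grad(x)
    return x

# Example: minimize f(x) = x1^2 + x2^2, whose gradient is 2x.
# The unique minimizer is the origin.
x_min = gradient_descent(lambda x: 2 * x, x0=[3.0, -4.0])
```

With a suitable step size on this strongly convex quadratic, the iterates contract toward the minimizer geometrically.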

References

  • Polak, Elijah (1997). Optimization: Algorithms and Consistent Approximations. Springer-Verlag. ISBN 0-387-94971-2.
