
Superlinear Conjugate Gradient Method with Adaptable Step Length and Constant Momentum Term

Peter GECZY, Shiro USUI


Summary:

First-order line search optimization techniques have gained substantial practical importance over second-order techniques because of their computational simplicity and low memory requirements. The computational cost of second-order methods becomes prohibitive for large optimization tasks; in such cases, variations of first-order approaches are the only applicable techniques. This article presents one such variation of a first-order line search optimization technique. The presented algorithm substantially simplifies the line search subproblem to a single-step calculation of an appropriate step length. This markedly reduces the implementation and computational complexity of the line search subproblem without harming the stability of the method. The algorithm is theoretically proven convergent, with superlinear convergence rates, and is exactly classified within the formerly proposed classification framework for first-order optimization. The performance of the proposed algorithm is evaluated on five data sets and compared to a relevant standard first-order optimization technique. The results indicate superior performance of the presented algorithm over the standard first-order method.
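The abstract does not give the authors' exact update rule, but the general idea of a first-order method with a constant momentum term and a step length computed in a single closed-form step (rather than by an iterative line search) can be sketched as follows. This is an illustrative assumption, not the paper's algorithm: the step-length rule here is a Barzilai-Borwein-style formula, and the function name and parameters are hypothetical.

```python
import numpy as np

def first_order_momentum(grad, x0, beta=0.9, alpha0=0.1,
                         tol=1e-6, max_iter=2000):
    """Illustrative first-order method with a constant momentum term
    and a single-step step-length rule (Barzilai-Borwein-style).
    NOTE: a generic sketch, not the algorithm proposed in the paper."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    v = np.zeros_like(x)          # momentum accumulator
    alpha = alpha0                # initial step length
    for _ in range(max_iter):
        v = beta * v - alpha * g  # constant momentum coefficient beta
        x_new = x + v
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        denom = float(y @ y)
        if denom > 0.0:
            # step length in one step: no iterative line search
            alpha = abs(float(s @ y)) / denom
        x, g = x_new, g_new
        if np.linalg.norm(g) < tol:
            break
    return x
```

The key point mirrored from the abstract is that the step length is obtained by a single calculation per iteration, avoiding the repeated function evaluations of a conventional line search.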

Publication
IEICE TRANSACTIONS on Fundamentals Vol.E83-A No.11 pp.2320-2328
Publication Date
2000/11/25
Type of Manuscript
PAPER
Category
Numerical Analysis and Optimization
