First order line search optimization techniques have gained essential practical importance over second order optimization techniques due to their computational simplicity and low memory requirements. The computational cost of second order methods becomes prohibitive for large optimization tasks. The only applicable optimization techniques in such cases are variations of first order approaches. This article presents one such variation of a first order line search optimization technique. The presented algorithm substantially simplifies the line search subproblem into a single-step calculation of an appropriate step length. This remarkably reduces the implementation and computational complexity of the line search subproblem, yet does not harm the stability of the method. The algorithm is theoretically proven convergent, with superlinear convergence rates, and is exactly classified within a formerly proposed classification framework for first order optimization. The performance of the proposed algorithm is evaluated on five data sets and compared to a relevant standard first order optimization technique. The results indicate superior performance of the presented algorithm over the standard first order method.
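The abstract's central idea, a conjugate gradient iteration whose line search is replaced by a one-shot step-length computation and whose conjugacy coefficient is a constant momentum term, can be sketched in a few lines. The step-length rule and the function name below are illustrative assumptions of this sketch, not the formulas derived in the paper.

```python
import numpy as np

def cg_constant_momentum(grad, x0, alpha0=0.05, beta=0.5,
                         max_iter=5000, tol=1e-6):
    """Sketch of a first order conjugate-gradient-style method with a
    constant momentum term and a single-step step-length rule.

    NOTE: alpha = alpha0 / (1 + ||d||) is an illustrative placeholder;
    the paper derives its own closed-form step-length expression.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                  # start with steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Single-step computation of the step length: no iterative
        # line search loop, just one closed-form evaluation.
        alpha = alpha0 / (1.0 + np.linalg.norm(d))
        x = x + alpha * d
        g = grad(x)
        # Constant momentum term beta in place of the usual
        # Fletcher-Reeves / Polak-Ribiere coefficient.
        d = -g + beta * d
    return x

# Example: minimize the two-dimensional Rosenbrock function.
rosen = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
rosen_grad = lambda x: np.array([
    -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
    200 * (x[1] - x[0]**2),
])
x_min = cg_constant_momentum(rosen_grad, np.array([-1.2, 1.0]))
print(rosen(x_min))   # objective value at the returned point
```

Replacing the iterative line search with such a one-shot rule removes the inner loop entirely: each iteration needs only one gradient evaluation and a handful of vector operations, which is the computational advantage the abstract emphasizes.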
Peter GECZY, Shiro USUI, "Superlinear Conjugate Gradient Method with Adaptable Step Length and Constant Momentum Term," IEICE TRANSACTIONS on Fundamentals, vol. E83-A, no. 11, pp. 2320-2328, November 2000.
Abstract: First order line search optimization techniques have gained essential practical importance over second order optimization techniques due to their computational simplicity and low memory requirements. The computational cost of second order methods becomes prohibitive for large optimization tasks. The only applicable optimization techniques in such cases are variations of first order approaches. This article presents one such variation of a first order line search optimization technique. The presented algorithm substantially simplifies the line search subproblem into a single-step calculation of an appropriate step length. This remarkably reduces the implementation and computational complexity of the line search subproblem, yet does not harm the stability of the method. The algorithm is theoretically proven convergent, with superlinear convergence rates, and is exactly classified within a formerly proposed classification framework for first order optimization. The performance of the proposed algorithm is evaluated on five data sets and compared to a relevant standard first order optimization technique. The results indicate superior performance of the presented algorithm over the standard first order method.
URL: https://global.ieice.org/en_transactions/fundamentals/10.1587/e83-a_11_2320/_p
@ARTICLE{e83-a_11_2320,
author={Peter GECZY and Shiro USUI},
journal={IEICE TRANSACTIONS on Fundamentals},
title={Superlinear Conjugate Gradient Method with Adaptable Step Length and Constant Momentum Term},
year={2000},
volume={E83-A},
number={11},
pages={2320-2328},
abstract={First order line search optimization techniques have gained essential practical importance over second order optimization techniques due to their computational simplicity and low memory requirements. The computational cost of second order methods becomes prohibitive for large optimization tasks. The only applicable optimization techniques in such cases are variations of first order approaches. This article presents one such variation of a first order line search optimization technique. The presented algorithm substantially simplifies the line search subproblem into a single-step calculation of an appropriate step length. This remarkably reduces the implementation and computational complexity of the line search subproblem, yet does not harm the stability of the method. The algorithm is theoretically proven convergent, with superlinear convergence rates, and is exactly classified within a formerly proposed classification framework for first order optimization. The performance of the proposed algorithm is evaluated on five data sets and compared to a relevant standard first order optimization technique. The results indicate superior performance of the presented algorithm over the standard first order method.},
keywords={},
doi={},
ISSN={},
month={November},}
TY - JOUR
TI - Superlinear Conjugate Gradient Method with Adaptable Step Length and Constant Momentum Term
T2 - IEICE TRANSACTIONS on Fundamentals
SP - 2320
EP - 2328
AU - Peter GECZY
AU - Shiro USUI
PY - 2000
DO -
JO - IEICE TRANSACTIONS on Fundamentals
SN -
VL - E83-A
IS - 11
JA - IEICE TRANSACTIONS on Fundamentals
Y1 - November 2000
AB - First order line search optimization techniques have gained essential practical importance over second order optimization techniques due to their computational simplicity and low memory requirements. The computational cost of second order methods becomes prohibitive for large optimization tasks. The only applicable optimization techniques in such cases are variations of first order approaches. This article presents one such variation of a first order line search optimization technique. The presented algorithm substantially simplifies the line search subproblem into a single-step calculation of an appropriate step length. This remarkably reduces the implementation and computational complexity of the line search subproblem, yet does not harm the stability of the method. The algorithm is theoretically proven convergent, with superlinear convergence rates, and is exactly classified within a formerly proposed classification framework for first order optimization. The performance of the proposed algorithm is evaluated on five data sets and compared to a relevant standard first order optimization technique. The results indicate superior performance of the presented algorithm over the standard first order method.
ER -