Applying a previously proposed classification framework for first-order line search optimization techniques, we introduce novel superlinear first-order line search methods. The novelty of the methods lies in the line search subproblem: it features automatic step-length and momentum adjustments at every iteration of the algorithms, realizable in a single-step calculation. This keeps the computational complexity of the algorithms linear and does not compromise their stability or convergence. The algorithms have zero or linear memory requirements and are shown to be convergent and capable of reaching superlinear convergence rates. They were applied in practice to artificial neural network training and compared to relevant training methods of the same class. The simulation results show that the introduced algorithms perform favorably against the standard and previously proposed methods.
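The abstract does not spell out the adjustment rule itself, so the Python sketch below only illustrates the general shape of such a method: a first-order update whose step length (here via an assumed Barzilai-Borwein rule) and momentum coefficient (here a hypothetical cosine-based schedule) are recomputed at every iteration in a single O(n) calculation. None of the names or update rules below are taken from the paper.

    import numpy as np

    def first_order_train(grad, w, iters=200):
        """Gradient descent with per-iteration step-length and momentum
        adjustment -- an illustrative sketch, not the paper's actual rule."""
        alpha, beta = 1e-2, 0.0      # step length and momentum coefficient
        v = np.zeros_like(w)         # momentum buffer (linear memory cost)
        w_prev, g_prev = None, None
        for _ in range(iters):
            g = grad(w)
            if g_prev is not None:
                s, y = w - w_prev, g - g_prev
                sy = s @ y
                if sy > 1e-12:
                    alpha = (s @ s) / sy   # assumed Barzilai-Borwein step length
                # hypothetical momentum schedule: keep momentum while successive
                # gradients point the same way, drop it when they reverse
                cos = (g @ g_prev) / (np.linalg.norm(g) * np.linalg.norm(g_prev) + 1e-12)
                beta = 0.9 * max(cos, 0.0)
            w_prev, g_prev = w.copy(), g.copy()
            v = beta * v - alpha * g       # single O(n) update per iteration
            w = w + v
        return w

    # usage: minimize the quadratic f(w) = 0.5 * w^T A w, whose minimizer is 0
    A = np.diag([1.0, 10.0])
    w_min = first_order_train(lambda w: A @ w, np.array([5.0, -3.0]))

Because both adjustments reuse quantities already computed for the gradient step, the per-iteration cost stays linear in the number of parameters, which is the property the abstract emphasizes.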
Peter GECZY, Shiro USUI, "Novel Superlinear First Order Algorithms," IEICE TRANSACTIONS on Fundamentals, vol. E87-A, no. 6, pp. 1620-1631, June 2004.
URL: https://global.ieice.org/en_transactions/fundamentals/10.1587/e87-a_6_1620/_p
@ARTICLE{e87-a_6_1620,
author={Peter GECZY and Shiro USUI},
journal={IEICE TRANSACTIONS on Fundamentals},
title={Novel Superlinear First Order Algorithms},
year={2004},
volume={E87-A},
number={6},
pages={1620-1631},
month={June},}
TY - JOUR
TI - Novel Superlinear First Order Algorithms
T2 - IEICE TRANSACTIONS on Fundamentals
SP - 1620
EP - 1631
AU - Peter GECZY
AU - Shiro USUI
PY - 2004
VL - E87-A
IS - 6
Y1 - June 2004
ER -