
Keyword Search Result

[Keyword] superlinear convergence (2 hits)

  • Novel First Order Optimization Classification Framework

    Peter GECZY  Shiro USUI  

     
    PAPER-Numerical Analysis and Optimization
    Vol: E83-A No:11, Page(s): 2312-2319

    Numerous scientific and engineering fields rely extensively on optimization techniques to find appropriate parameter values for their models, and a variety of optimization methods is available for practical use. Optimization algorithms are classified primarily according to their rates of convergence. Unfortunately, in practice a particular optimization method with a given theoretical convergence rate often performs substantially differently across diverse optimization tasks, so a classification based on convergence rates alone loses its relevance for practical optimization. It is therefore desirable to formulate a classification framework that is relevant both to the theoretical concept of convergence rates and to practical optimization. This article introduces such a framework. The proposed classification framework enables the specification of optimization techniques and optimization tasks, and it retains an inherent relationship to convergence rates. The framework is applied to categorizing the tasks of optimizing polynomials and the problem of training multilayer perceptron neural networks.
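    For reference, the standard textbook convergence-rate definitions that such a classification builds on, for a sequence x_k converging to a minimizer x*, can be written as follows (general background only, not the paper's own framework):

        \begin{align*}
        \text{linear:}      &\quad \|x_{k+1}-x^*\| \le c\,\|x_k-x^*\|, \qquad 0 < c < 1,\\
        \text{superlinear:} &\quad \lim_{k\to\infty} \frac{\|x_{k+1}-x^*\|}{\|x_k-x^*\|} = 0,\\
        \text{quadratic:}   &\quad \|x_{k+1}-x^*\| \le C\,\|x_k-x^*\|^2 \ \text{ for some } C > 0.
        \end{align*}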

  • Superlinear Conjugate Gradient Method with Adaptable Step Length and Constant Momentum Term

    Peter GECZY  Shiro USUI  

     
    PAPER-Numerical Analysis and Optimization
    Vol: E83-A No:11, Page(s): 2320-2328

    First order line search optimization techniques have gained essential practical importance over second order techniques because of their computational simplicity and low memory requirements. The computational cost of second order methods becomes prohibitive for large optimization tasks, and the only applicable techniques in such cases are variations of first order approaches. This article presents one such variation of a first order line search optimization technique. The presented algorithm substantially simplifies the line search subproblem to a single-step calculation of an appropriate step length value, which markedly reduces the implementation effort and computational complexity of the line search subproblem without harming the stability of the method. The algorithm is theoretically proven convergent with superlinear convergence rates and is exactly classified within the previously proposed classification framework for first order optimization. The performance of the proposed algorithm is evaluated on five data sets and compared to a relevant standard first order optimization technique. The results indicate superior performance of the presented algorithm over the standard first order method.
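    As a rough illustration of the kind of update the abstract describes (a first order search direction with a constant momentum term and a step length obtained in a single calculation rather than by an iterative line search), a minimal Python sketch follows. The step-length rule, parameter values, and function names are illustrative assumptions only; the paper's actual formulas are not reproduced here.

    import numpy as np

    def first_order_momentum_sketch(grad, x0, alpha0=0.1, beta=0.5,
                                    max_iter=1000, tol=1e-8):
        """Illustrative first order optimizer with a constant momentum term.

        grad   : callable returning the gradient at a point
        x0     : initial parameter vector
        alpha0 : base step length (assumed; the paper derives its own rule)
        beta   : constant momentum coefficient (assumed value)
        """
        x = np.asarray(x0, dtype=float)
        direction = np.zeros_like(x)
        for _ in range(max_iter):
            g = grad(x)
            g_norm = float(np.linalg.norm(g))
            if g_norm < tol:
                break
            # Search direction: negative gradient plus a constant-momentum term,
            # in the spirit of a conjugate-gradient-style first order update.
            direction = -g + beta * direction
            # Single-step step-length calculation (a simple illustrative
            # normalization, NOT the adaptive rule proposed in the paper).
            alpha = alpha0 / (1.0 + g_norm)
            x = x + alpha * direction
        return x

    # Example usage on a simple quadratic: minimize f(x) = 0.5 * ||x||^2,
    # whose gradient is x and whose minimizer is the origin.
    x_min = first_order_momentum_sketch(lambda x: x, np.array([3.0, -2.0]))
    print(x_min)  # values close to zero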