
Keyword Search Result

[Keyword] absolute exponential stability (4 hits)

Results 1-4 of 4
  • Necessary and Sufficient Condition for Absolute Exponential Stability of a Class of Nonsymmetric Neural Networks

    Xue-Bin LIANG, Toru YAMAGUCHI
    PAPER-Bio-Cybernetics and Neurocomputing
    Vol: E80-D No:8, Page(s): 802-807

    In this paper, we prove that for the class of nonsymmetric neural networks whose connection matrices T have nonnegative off-diagonal entries, the condition that -T is an M-matrix is necessary and sufficient for absolute exponential stability of every network in this class. While this result extends the existing absolute stability result of Forti et al., the proof given in this paper is simpler and uses an approach different from that of Forti et al. The most significant consequence is that the class of nonsymmetric neural networks whose connection matrices T satisfy the condition that -T is an M-matrix is the largest class of nonsymmetric neural networks that can be employed for embedding and solving optimization problems with a global exponential rate of convergence to the optimal solution and without the risk of spurious responses. An illustrative numerical example is also given.
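
    A hedged numerical sketch of the condition described in the entry above, using the standard Z-matrix characterization of M-matrices (Berman and Plemmons) rather than anything prescribed by the paper: when T has nonnegative off-diagonal entries, -T is a Z-matrix, and it is a nonsingular M-matrix exactly when every eigenvalue of -T has positive real part. The matrix T below is a made-up 3-neuron example, not taken from the paper.

    ```python
    import numpy as np

    def minus_T_is_M_matrix(T, strict=True, tol=1e-10):
        """Check whether -T is an M-matrix for a connection matrix T whose
        off-diagonal entries are nonnegative (so -T is a Z-matrix).

        Characterization used: a Z-matrix is a nonsingular M-matrix iff every
        eigenvalue has positive real part; allowing zero real parts gives the
        possibly-singular case (strict=False).
        """
        A = -np.asarray(T, dtype=float)          # -T is a Z-matrix by assumption
        real_parts = np.linalg.eigvals(A).real
        return bool(np.all(real_parts > tol) if strict else np.all(real_parts > -tol))

    # Hypothetical connection matrix: negative diagonal, nonnegative off-diagonal.
    T = np.array([[-3.0,  0.5,  0.2],
                  [ 0.4, -2.5,  0.3],
                  [ 0.1,  0.6, -4.0]])
    print(minus_T_is_M_matrix(T))  # True: -T is strictly diagonally dominant here
    ```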

  • Absolute Exponential Stability of Neural Networks with Asymmetric Connection Matrices

    Xue-Bin LIANG, Toru YAMAGUCHI
    LETTER-Neural Networks
    Vol: E80-A No:8, Page(s): 1531-1534

    In this letter, an absolute exponential stability result for neural networks with asymmetric connection matrices is obtained by a new proof approach; it generalizes the existing result on absolute stability of neural networks. It is also demonstrated that the network time constant is inversely proportional to the global exponential convergence rate of the network trajectories to the unique equilibrium. A numerical simulation example is given to illustrate the obtained analysis results.
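
    A minimal sketch of the time-constant/rate relation stated in the entry above, assuming the usual additive (Hopfield-type) model with time constant \tau; the model form and the symbols \alpha, \beta are illustrative assumptions, not quoted from the letter.

    ```latex
    % Assumed additive (Hopfield-type) model with time constant \tau;
    % the symbols \alpha and \beta below are illustrative, not from the letter.
    \[
      \tau\,\dot{x}_i(t) = -x_i(t) + \sum_{j=1}^{n} T_{ij}\, g_j\bigl(x_j(t)\bigr) + I_i,
      \qquad i = 1,\dots,n .
    \]
    % Absolute exponential stability asserts, for every admissible activation g
    % and every input I, a bound of the form
    \[
      \bigl\|x(t) - x^{*}\bigr\| \;\le\; \beta\,\bigl\|x(0) - x^{*}\bigr\|\, e^{-\alpha t/\tau},
      \qquad \alpha,\beta > 0,
    \]
    % so the guaranteed exponential convergence rate \alpha/\tau scales as 1/\tau:
    % doubling the time constant halves the rate.
    ```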

  • On the Absolute Exponential Stability of Neural Networks with Globally Lipschitz Continuous Activation Functions

    Xue-Bin LIANG, Toru YAMAGUCHI
    LETTER-Bio-Cybernetics and Neurocomputing
    Vol: E80-D No:6, Page(s): 687-690

    In this letter, we obtain an absolute exponential stability result for neural networks with globally Lipschitz continuous, increasing, and bounded activation functions under a sufficient condition that unifies several relevant sufficient conditions for absolute stability in the literature. The obtained absolute exponential stability result generalizes the existing results on absolute stability of neural networks. Moreover, it is demonstrated, by a mathematically rigorous proof, that the network time constant is inversely proportional to the global exponential convergence rate of the network trajectories to the unique equilibrium. A numerical simulation example is also presented to illustrate the analysis results.
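
    For concreteness, the three properties of the activation class named in the entry above can be written out as below; the Lipschitz constants L_i and bounds M_i are illustrative symbols, not the letter's notation, and "increasing" is read in the non-strict sense.

    ```latex
    % Activation class named in the abstract, with illustrative symbols L_i, M_i:
    \[
      |g_i(u) - g_i(v)| \le L_i\,|u - v|, \qquad
      \bigl(g_i(u) - g_i(v)\bigr)(u - v) \ge 0, \qquad
      |g_i(u)| \le M_i,
      \qquad \text{for all } u, v \in \mathbb{R}.
    \]
    % The standard sigmoid g_i(u) = \tanh(u) satisfies all three with L_i = M_i = 1.
    ```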

  • Necessary and Sufficient Condition for Absolute Exponential Stability of Hopfield-Type Neural Networks

    Xue-Bin LIANG, Toru YAMAGUCHI
    PAPER-Bio-Cybernetics and Neurocomputing
    Vol: E79-D No:7, Page(s): 990-993

    The main result of this paper is that for a Hopfield-type neural circuit with a symmetric connection matrix T, negative semidefiniteness of T is a necessary and sufficient condition for absolute exponential stability. While this result extends the absolute stability result of Forti et al. [1], the proof given in this paper is simpler and uses an approach different from that of Forti et al. [1]. The most significant consequence is that the class of neural networks with negative semidefinite matrices T is the largest class of symmetric networks that can be employed for embedding and solving optimization problems with a global exponential rate of convergence to the optimal solution and without the risk of spurious responses.
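
    A hedged numerical sketch of the condition in this entry: for a symmetric connection matrix T, negative semidefiniteness can be tested directly through the eigenvalues of T. The example matrix is hypothetical, not taken from the paper.

    ```python
    import numpy as np

    def is_negative_semidefinite(T, tol=1e-10):
        """Symmetric T is negative semidefinite iff all its eigenvalues are <= 0."""
        T = np.asarray(T, dtype=float)
        assert np.allclose(T, T.T), "T is assumed symmetric here"
        return bool(np.all(np.linalg.eigvalsh(T) <= tol))

    # Hypothetical symmetric 3-neuron connection matrix (not from the paper).
    T = np.array([[-2.0,  1.0,  0.0],
                  [ 1.0, -2.0,  1.0],
                  [ 0.0,  1.0, -1.0]])
    print(is_negative_semidefinite(T))  # True: this T is in fact negative definite
    ```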