
Author Search Result

[Author] Xue-Bin LIANG (8 hits)

  • On the Analysis of Global and Absolute Stability of Nonlinear Continuous Neural Networks

    Xue-Bin LIANG, Toru YAMAGUCHI

    PAPER-Neural Networks
    Vol: E80-A No:1  Page(s): 223-229

    This paper obtains new results on the existence, uniqueness, and global asymptotic stability of the equilibrium of a nonlinear continuous neural network, under a sufficient condition weaker than those presented in the literature. The obtained results also imply the existing results on absolute stability of nonlinear continuous neural networks.
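    The network model under study is not reproduced in this listing. For orientation, a standard nonlinear continuous (Hopfield-type) network to which results of this kind apply has the form below; this is an assumption based on the usual model in this literature, not a formula quoted from the paper.

    ```latex
    % Assumed standard additive continuous neural network model.
    % x_i: state of neuron i, tau: network time constant, T: connection
    % matrix, g_j: activation function, I_i: constant external input.
    \begin{equation*}
      \frac{dx_i(t)}{dt} = -\frac{x_i(t)}{\tau}
        + \sum_{j=1}^{n} T_{ij}\, g_j\bigl(x_j(t)\bigr) + I_i,
      \qquad i = 1, \dots, n .
    \end{equation*}
    ```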

  • On the Absolute Exponential Stability of Neural Networks with Globally Lipschitz Continuous Activation Functions

    Xue-Bin LIANG, Toru YAMAGUCHI

    LETTER-Bio-Cybernetics and Neurocomputing
    Vol: E80-D No:6  Page(s): 687-690

    In this letter, we establish absolute exponential stability of neural networks with globally Lipschitz continuous, increasing, and bounded activation functions, under a sufficient condition that unifies several relevant sufficient conditions for absolute stability in the literature. The obtained absolute exponential stability result generalizes the existing results on absolute stability of neural networks. Moreover, it is demonstrated, by a mathematically rigorous proof, that the network time constant is inversely proportional to the global exponential convergence rate of the network trajectories to the unique equilibrium. A numerical simulation example is also presented to illustrate the analysis results.
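    A minimal sketch of the stated relationship between time constant and convergence rate, assuming the usual form of a global exponential stability estimate (the constants and norm are illustrative, not taken from the letter):

    ```latex
    % Assumed form of the exponential estimate: trajectories approach the
    % unique equilibrium x* at a rate proportional to 1/tau, so a larger
    % network time constant tau means a slower convergence rate.
    \begin{equation*}
      \|x(t) - x^{*}\| \le \beta \,\|x(0) - x^{*}\|\, e^{-\alpha t / \tau},
      \qquad \alpha, \beta > 0 \ \text{constants independent of } \tau .
    \end{equation*}
    ```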

  • A Remark on a Class of Stability Conditions for Neural Networks

    Xue-Bin LIANG, Toru YAMAGUCHI

    LETTER-Bio-Cybernetics and Neurocomputing
    Vol: E79-D No:7  Page(s): 1004-1005

    This letter points out that while the class of conditions presented by Matsuoka [1] is indeed sufficient for absolute stability of neural networks, the proof of sufficiency given in [1] is not sound. As a remark, a mathematically rigorous proof that this class of conditions is sufficient for absolute stability of neural networks is provided.

  • Necessary and Sufficient Condition for Absolute Exponential Stability of a Class of Nonsymmetric Neural Networks

    Xue-Bin LIANG, Toru YAMAGUCHI

    PAPER-Bio-Cybernetics and Neurocomputing
    Vol: E80-D No:8  Page(s): 802-807

    In this paper, we prove that for the class of nonsymmetric neural networks whose connection matrices T have nonnegative off-diagonal entries, the condition that -T is an M-matrix is necessary and sufficient for absolute exponential stability of networks in this class. While this result extends the existing absolute stability result of Forti et al., the proof given in this paper is simpler and is completed by a different approach. The most significant consequence is that the class of nonsymmetric neural networks whose connection matrices T satisfy the condition that -T is an M-matrix is the largest class of nonsymmetric neural networks that can be employed for embedding and solving optimization problems with a global exponential rate of convergence to the optimal solution and without the risk of spurious responses. An illustrative numerical example is also given.
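    The stated condition is directly checkable for a given connection matrix. Below is a minimal numpy sketch (the helper name is ours, not the paper's) using the standard characterization of a nonsingular M-matrix: a Z-matrix, i.e. one with nonpositive off-diagonal entries, whose leading principal minors are all positive.

    ```python
    import numpy as np

    def neg_T_is_M_matrix(T, tol=1e-12):
        """Check whether -T is a nonsingular M-matrix.

        Standard characterization: A = -T must be a Z-matrix
        (nonpositive off-diagonal entries, i.e. T has nonnegative
        off-diagonals) with all leading principal minors positive.
        Illustrative sketch only; not code from the paper.
        """
        A = -np.asarray(T, dtype=float)
        n = A.shape[0]
        off_diag = A - np.diag(np.diag(A))
        if np.any(off_diag > tol):          # A must be a Z-matrix
            return False
        # All leading principal minors of A must be positive.
        return all(np.linalg.det(A[:k, :k]) > tol for k in range(1, n + 1))

    # Example: T has nonnegative off-diagonals and -T is an M-matrix.
    T = np.array([[-2.0, 0.5],
                  [0.5, -2.0]])
    print(neg_T_is_M_matrix(T))  # True
    ```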

  • Absolute Exponential Stability of Neural Networks with Asymmetric Connection Matrices

    Xue-Bin LIANG, Toru YAMAGUCHI

    LETTER-Neural Networks
    Vol: E80-A No:8  Page(s): 1531-1534

    In this letter, an absolute exponential stability result for neural networks with asymmetric connection matrices is obtained by a new proof approach; it generalizes the existing result on absolute stability of neural networks. It is demonstrated that the network time constant is inversely proportional to the global exponential convergence rate of the network trajectories to the unique equilibrium. A numerical simulation example is also given to illustrate the obtained analysis results.

  • Optimal Design of Hopfield-Type Associative Memory by Adaptive Stability-Growth Method

    Xue-Bin LIANG, Toru YAMAGUCHI

    LETTER-Bio-Cybernetics and Neurocomputing
    Vol: E81-D No:1  Page(s): 148-150

    An adaptive stability-growth (ASG) learning algorithm is proposed for improving, as much as possible, the stability of a Hopfield-type associative memory. While the ASG algorithm can be used, in place of the well-known minimum-overlap (MO) learning algorithm with a sufficiently large lower bound on the MO value, to determine the optimal stability, it converges much more quickly than the MO algorithm in practical implementations. The proposed ASG algorithm is therefore more suitable than the MO algorithm for real-world design of optimal Hopfield-type associative memories.
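    The ASG algorithm itself is not reproduced in this listing. As a rough illustration of the general idea of growing a stability margin during learning, the sketch below enforces an adaptively increasing margin on the aligned local fields of the stored patterns with a perceptron-style update; the function name, update rule, and growth schedule are our assumptions, not the paper's algorithm.

    ```python
    import numpy as np

    def stability_growth_learning(patterns, eta=0.1, kappa_step=0.05,
                                  max_epochs=1000):
        """Rough sketch of margin-growing learning for a Hopfield-type
        associative memory. NOT the paper's ASG algorithm; an assumed
        perceptron-style variant for illustration only.

        patterns: array of shape (p, n) with entries +/-1.
        Returns (W, kappa): weights and the current margin target.
        """
        p, n = patterns.shape
        W = np.zeros((n, n))
        kappa = 0.0
        for _ in range(max_epochs):
            updated = False
            for xi in patterns:
                h = W @ xi                        # local fields
                margins = xi * h                  # aligned local fields
                for i in np.where(margins < kappa)[0]:
                    W[i] += eta * xi[i] * xi / n  # Hebbian-style correction
                    W[i, i] = 0.0                 # keep zero self-coupling
                    updated = True
            if not updated:
                kappa += kappa_step               # all margins met: grow target
        return W, kappa

    # Example: store two random 8-bit patterns (illustrative only).
    rng = np.random.default_rng(0)
    pats = rng.choice([-1.0, 1.0], size=(2, 8))
    W, kappa = stability_growth_learning(pats, max_epochs=200)
    ```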

  • On the Global Asymptotic Stability Independent of Delay of Neural Networks

    Xue-Bin LIANG, Toru YAMAGUCHI

    LETTER-Neural Networks
    Vol: E80-A No:1  Page(s): 247-250

    Recurrent neural networks have the potential to perform parallel computation for associative memory and optimization, which is realized by electronic implementation of neural networks in VLSI technology. Since time delays are unavoidable in real electronic implementations of neural networks and can cause systems to oscillate, it is of practical importance to investigate the qualitative properties of neural networks with time delays. In this paper, a class of sufficient conditions is obtained under which neural networks are globally asymptotically stable independent of time delays.
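    For orientation, the delayed network model typically studied in delay-independent stability results is of the form below, with constant transmission delays; this is the standard model assumed here, not a formula quoted from the paper.

    ```latex
    % Assumed standard delayed network model: sigma_{ij} >= 0 are constant
    % transmission delays; "independent of delay" means the stability
    % conclusion holds for every choice of the sigma_{ij}.
    \begin{equation*}
      \frac{dx_i(t)}{dt} = -\frac{x_i(t)}{\tau}
        + \sum_{j=1}^{n} T_{ij}\, g_j\bigl(x_j(t - \sigma_{ij})\bigr) + I_i,
      \qquad i = 1, \dots, n .
    \end{equation*}
    ```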

  • Necessary and Sufficient Condition for Absolute Exponential Stability of Hopfield-Type Neural Networks

    Xue-Bin LIANG, Toru YAMAGUCHI

    PAPER-Bio-Cybernetics and Neurocomputing
    Vol: E79-D No:7  Page(s): 990-993

    The main result of this paper is that for a Hopfield-type neural circuit with a symmetric connection matrix T, negative semidefiniteness of T is a necessary and sufficient condition for absolute exponential stability. While this result extends the absolute stability result of Forti et al. [1], the proof given in this paper is simpler and is completed by a different approach. The most significant consequence is that the class of neural networks with negative semidefinite matrices T is the largest class of symmetric networks that can be employed for embedding and solving optimization problems with a global exponential rate of convergence to the optimal solution and without the risk of spurious responses.
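    As with the M-matrix condition above, the stated test is directly checkable. A minimal numpy sketch, assuming the standard definition of negative semidefiniteness (symmetric T with no positive eigenvalues); the helper name is ours, not the paper's.

    ```python
    import numpy as np

    def is_neg_semidefinite(T, tol=1e-10):
        """Check the stated condition for a symmetric connection matrix T:
        T is negative semidefinite iff all its eigenvalues are <= 0.
        Minimal illustrative check, not code from the paper."""
        T = np.asarray(T, dtype=float)
        if not np.allclose(T, T.T, atol=tol):
            raise ValueError("T must be symmetric")
        return bool(np.all(np.linalg.eigvalsh(T) <= tol))

    # Example: a symmetric, negative semidefinite connection matrix.
    T = np.array([[-1.0, 1.0],
                  [1.0, -1.0]])
    print(is_neg_semidefinite(T))  # True (eigenvalues 0 and -2)
    ```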