Xue-Bin LIANG, Toru YAMAGUCHI, "On the Absolute Exponential Stability of Neural Networks with Globally Lipschitz Continuous Activation Functions" in IEICE TRANSACTIONS on Information,
vol. E80-D, no. 6, pp. 687-690, June 1997.
Abstract: In this letter, we obtain the absolute exponential stability result of neural networks with globally Lipschitz continuous, increasing and bounded activation functions under a sufficient condition which can unify some relevant sufficient ones for absolute stability in the literature. The obtained absolute exponential stability result generalizes the existing ones about absolute stability of neural networks. Moreover, it is demonstrated, by a mathematically rigorous proof, that the network time constant is inversely proportional to the global exponential convergence rate of the network trajectories to the unique equilibrium. A numerical simulation example is also presented to illustrate the analysis results.
URL: https://global.ieice.org/en_transactions/information/10.1587/e80-d_6_687/_p
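For context, results of this kind are typically stated for the additive neural-network model sketched below; this generic form is an assumption for illustration, since the letter defines its own system precisely:

```latex
% Additive network with time constant \tau; each activation g_j is assumed
% globally Lipschitz continuous, increasing, and bounded.
\tau \,\dot{x}_i(t) = -x_i(t) + \sum_{j=1}^{n} w_{ij}\, g_j\bigl(x_j(t)\bigr) + u_i,
\qquad i = 1,\dots,n.
```

Absolute exponential stability then means that for every activation in the admissible class and every input, all trajectories converge to the unique equilibrium $x^{*}$ as $\|x(t) - x^{*}\| \le M\,\|x(0) - x^{*}\|\,e^{-\varepsilon t/\tau}$ for some $M \ge 1$ and $\varepsilon > 0$; the $1/\tau$ factor in the exponent is the sense in which the convergence rate is inversely proportional to the network time constant.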
@ARTICLE{e80-d_6_687,
author={Xue-Bin LIANG and Toru YAMAGUCHI},
journal={IEICE TRANSACTIONS on Information},
title={On the Absolute Exponential Stability of Neural Networks with Globally Lipschitz Continuous Activation Functions},
year={1997},
volume={E80-D},
number={6},
pages={687-690},
abstract={In this letter, we obtain the absolute exponential stability result of neural networks with globally Lipschitz continuous, increasing and bounded activation functions under a sufficient condition which can unify some relevant sufficient ones for absolute stability in the literature. The obtained absolute exponential stability result generalizes the existing ones about absolute stability of neural networks. Moreover, it is demonstrated, by a mathematically rigorous proof, that the network time constant is inversely proportional to the global exponential convergence rate of the network trajectories to the unique equilibrium. A numerical simulation example is also presented to illustrate the analysis results.},
month={June},
}
TY - JOUR
TI - On the Absolute Exponential Stability of Neural Networks with Globally Lipschitz Continuous Activation Functions
T2 - IEICE TRANSACTIONS on Information
SP - 687
EP - 690
AU - Xue-Bin LIANG
AU - Toru YAMAGUCHI
PY - 1997
JO - IEICE TRANSACTIONS on Information
VL - E80-D
IS - 6
JA - IEICE TRANSACTIONS on Information
Y1 - 1997/06//
AB - In this letter, we obtain the absolute exponential stability result of neural networks with globally Lipschitz continuous, increasing and bounded activation functions under a sufficient condition which can unify some relevant sufficient ones for absolute stability in the literature. The obtained absolute exponential stability result generalizes the existing ones about absolute stability of neural networks. Moreover, it is demonstrated, by a mathematically rigorous proof, that the network time constant is inversely proportional to the global exponential convergence rate of the network trajectories to the unique equilibrium. A numerical simulation example is also presented to illustrate the analysis results.
ER -
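The abstract mentions a numerical simulation example. The letter's own example is not reproduced here, but a minimal sketch (weights, inputs, and initial states are all illustrative assumptions, chosen so the weight matrix has spectral norm below 1) shows the qualitative behavior described: with a globally Lipschitz, increasing, bounded activation, trajectories from different initial states converge exponentially to a single equilibrium, at a rate that scales like 1/τ.

```python
# Illustrative sketch only (not the letter's example): forward-Euler
# simulation of the additive network  tau * dx/dt = -x + W @ g(x) + u
# with g = tanh, which is globally Lipschitz, increasing, and bounded.
import numpy as np

def simulate(tau, W, u, x0, t_end=20.0, dt=1e-3):
    """Forward-Euler integration of tau * dx/dt = -x + W @ tanh(x) + u."""
    x = np.array(x0, dtype=float)
    for _ in range(int(t_end / dt)):
        x = x + dt * (-x + W @ np.tanh(x) + u) / tau
    return x

W = np.array([[0.2, -0.1],
              [0.1,  0.1]])   # spectral norm ~0.23 < 1: contraction-type condition
u = np.array([0.5, -0.3])

# Trajectories from different initial states approach one equilibrium...
x_a = simulate(1.0, W, u, [3.0, -2.0])
x_b = simulate(1.0, W, u, [-1.0, 4.0])
gap_fast = np.linalg.norm(x_a - x_b)

# ...and doubling the time constant halves the decay rate, so after the
# same elapsed time the gap between the two trajectories is larger.
gap_slow = np.linalg.norm(simulate(2.0, W, u, [3.0, -2.0])
                          - simulate(2.0, W, u, [-1.0, 4.0]))
```

The contraction argument behind the choice of `W`: since `tanh` has Lipschitz constant 1, the distance between any two trajectories shrinks at least as fast as exp(-(1 - ‖W‖)t/τ), which is the 1/τ dependence the abstract proves rigorously.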