A new training algorithm for the chaotic Adachi Neural Network (AdNN) is investigated. The classical training algorithm for the AdNN and its variants is usually "one-shot" learning; the Outer Product Rule (OPR), for example, is the most widely used. Although the OPR is effective for conventional neural networks, its effectiveness and adequacy for Chaotic Neural Networks (CNNs) have not been formally discussed. As a complementary and tentative contribution to this field, we modify the AdNN's weights by enforcing an unsupervised Hebbian rule. Experimental analysis shows that, across different settings, the newly weighted AdNN exhibits even stronger dynamical associative memory and pattern-recognition phenomena than the original AdNN.
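For readers unfamiliar with the two training schemes the abstract contrasts, the sketch below illustrates them side by side: the standard one-shot OPR, which sets each weight from the bipolar outer product of the stored patterns, followed by a hypothetical iterative Hebbian refinement. The OPR formula matches the usual AdNN setup; the `hebbian_refinement` function, its learning rate `eta`, and the epoch count are assumptions for illustration only, as the paper's exact Hebbian rule is not reproduced here.

```python
import numpy as np

def opr_weights(patterns):
    """One-shot Outer Product Rule (OPR), the classical AdNN training step.

    patterns: array of shape (P, N) with binary entries in {0, 1}.
    Returns an N x N symmetric weight matrix with zero diagonal, where
    w_ij = (1/P) * sum_p (2*x_i^p - 1) * (2*x_j^p - 1).
    """
    P, N = patterns.shape
    bipolar = 2.0 * patterns - 1.0      # map {0, 1} -> {-1, +1}
    W = bipolar.T @ bipolar / P         # outer-product sum over patterns
    np.fill_diagonal(W, 0.0)            # no self-connections
    return W

def hebbian_refinement(W, patterns, eta=0.01, epochs=50):
    """Hypothetical iterative Hebbian re-weighting of the OPR matrix.

    Sketch only: assumes a plain Hebbian update dW = eta * x * x^T applied
    repeatedly per stored pattern; the paper's actual unsupervised rule,
    learning rate, and stopping criterion may differ.
    """
    W = W.copy()
    bipolar = 2.0 * patterns - 1.0
    for _ in range(epochs):
        for x in bipolar:
            W += eta * np.outer(x, x)   # Hebb: strengthen co-active pairs
            np.fill_diagonal(W, 0.0)
    return W

# Example: four random binary patterns over 100 neurons (a 10x10 grid,
# as in Adachi-style experiments).
rng = np.random.default_rng(0)
patterns = rng.integers(0, 2, size=(4, 100))
W = hebbian_refinement(opr_weights(patterns), patterns)
```

The key contrast is that `opr_weights` fixes the weights in a single pass, while the Hebbian loop keeps adjusting them from repeated pattern presentations, which is the kind of modification the abstract credits for the stronger associative-memory dynamics.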
Guangchun LUO, Jinsheng REN, Ke QIN, "Dynamical Associative Memory: The Properties of the New Weighted Chaotic Adachi Neural Network" in IEICE TRANSACTIONS on Information,
vol. E95-D, no. 8, pp. 2158-2162, August 2012, doi: 10.1587/transinf.E95.D.2158.
Abstract: A new training algorithm for the chaotic Adachi Neural Network (AdNN) is investigated. The classical training algorithm for the AdNN and its variants is usually "one-shot" learning; the Outer Product Rule (OPR), for example, is the most widely used. Although the OPR is effective for conventional neural networks, its effectiveness and adequacy for Chaotic Neural Networks (CNNs) have not been formally discussed. As a complementary and tentative contribution to this field, we modify the AdNN's weights by enforcing an unsupervised Hebbian rule. Experimental analysis shows that, across different settings, the newly weighted AdNN exhibits even stronger dynamical associative memory and pattern-recognition phenomena than the original AdNN.
URL: https://global.ieice.org/en_transactions/information/10.1587/transinf.E95.D.2158/_p
@ARTICLE{e95-d_8_2158,
author={Guangchun LUO and Jinsheng REN and Ke QIN},
journal={IEICE TRANSACTIONS on Information},
title={Dynamical Associative Memory: The Properties of the New Weighted Chaotic Adachi Neural Network},
year={2012},
volume={E95-D},
number={8},
pages={2158-2162},
abstract={A new training algorithm for the chaotic Adachi Neural Network (AdNN) is investigated. The classical training algorithm for the AdNN and its variants is usually "one-shot" learning; the Outer Product Rule (OPR), for example, is the most widely used. Although the OPR is effective for conventional neural networks, its effectiveness and adequacy for Chaotic Neural Networks (CNNs) have not been formally discussed. As a complementary and tentative contribution to this field, we modify the AdNN's weights by enforcing an unsupervised Hebbian rule. Experimental analysis shows that, across different settings, the newly weighted AdNN exhibits even stronger dynamical associative memory and pattern-recognition phenomena than the original AdNN.},
keywords={},
doi={10.1587/transinf.E95.D.2158},
ISSN={1745-1361},
month={August},
}
TY - JOUR
TI - Dynamical Associative Memory: The Properties of the New Weighted Chaotic Adachi Neural Network
T2 - IEICE TRANSACTIONS on Information
SP - 2158
EP - 2162
AU - Guangchun LUO
AU - Jinsheng REN
AU - Ke QIN
PY - 2012
DO - 10.1587/transinf.E95.D.2158
JO - IEICE TRANSACTIONS on Information
SN - 1745-1361
VL - E95-D
IS - 8
JA - IEICE TRANSACTIONS on Information
Y1 - August 2012
AB - A new training algorithm for the chaotic Adachi Neural Network (AdNN) is investigated. The classical training algorithm for the AdNN and its variants is usually "one-shot" learning; the Outer Product Rule (OPR), for example, is the most widely used. Although the OPR is effective for conventional neural networks, its effectiveness and adequacy for Chaotic Neural Networks (CNNs) have not been formally discussed. As a complementary and tentative contribution to this field, we modify the AdNN's weights by enforcing an unsupervised Hebbian rule. Experimental analysis shows that, across different settings, the newly weighted AdNN exhibits even stronger dynamical associative memory and pattern-recognition phenomena than the original AdNN.
ER -