We apply Fahlman and Lebiere's (FL) algorithm to network synthesis and incremental learning by making use of already-trained networks, each performing a specified task, to design a system that performs a global or extended task without destroying the information gained by the previously trained nets. Investigation shows that the synthesized or expanded FL networks have generalization ability superior to Back propagation (BP) networks in which the number of newly added hidden units must be pre-specified.
The copyright of the original papers published on this site belongs to IEICE. Unauthorized use of the original or translated papers is prohibited. See IEICE Provisions on Copyright for details.
Masanori HAMAMOTO, Joarder KAMRUZZAMAN, Yukio KUMAGAI, Hiromitsu HIKITA, "Incremental Learning and Generalization Ability of Artificial Neural Network Trained by Fahlman and Lebiere's Learning Algorithm," IEICE TRANSACTIONS on Fundamentals, vol. E76-A, no. 2, pp. 242-247, February 1993.
URL: https://global.ieice.org/en_transactions/fundamentals/10.1587/e76-a_2_242/_p
@ARTICLE{e76-a_2_242,
author={Masanori HAMAMOTO and Joarder KAMRUZZAMAN and Yukio KUMAGAI and Hiromitsu HIKITA},
journal={IEICE TRANSACTIONS on Fundamentals},
title={Incremental Learning and Generalization Ability of Artificial Neural Network Trained by Fahlman and Lebiere's Learning Algorithm},
year={1993},
volume={E76-A},
number={2},
pages={242-247},
abstract={We apply Fahlman and Lebiere's (FL) algorithm to network synthesis and incremental learning by making use of already-trained networks, each performing a specified task, to design a system that performs a global or extended task without destroying the information gained by the previously trained nets. Investigation shows that the synthesized or expanded FL networks have generalization ability superior to Back propagation (BP) networks in which the number of newly added hidden units must be pre-specified.},
month={February},}
TY - JOUR
TI - Incremental Learning and Generalization Ability of Artificial Neural Network Trained by Fahlman and Lebiere's Learning Algorithm
T2 - IEICE TRANSACTIONS on Fundamentals
SP - 242
EP - 247
AU - Masanori HAMAMOTO
AU - Joarder KAMRUZZAMAN
AU - Yukio KUMAGAI
AU - Hiromitsu HIKITA
PY - 1993
JO - IEICE TRANSACTIONS on Fundamentals
VL - E76-A
IS - 2
JA - IEICE TRANSACTIONS on Fundamentals
Y1 - 1993/02//
AB - We apply Fahlman and Lebiere's (FL) algorithm to network synthesis and incremental learning by making use of already-trained networks, each performing a specified task, to design a system that performs a global or extended task without destroying the information gained by the previously trained nets. Investigation shows that the synthesized or expanded FL networks have generalization ability superior to Back propagation (BP) networks in which the number of newly added hidden units must be pre-specified.
ER -