Fahlman and Lebiere's (FL) learning algorithm begins with a two-layer network and, in the course of training, can construct various network architectures. We applied the FL algorithm to the same three-layer network architecture as a back-propagation (BP) network and compared their generalization properties. Simulation results show that the FL algorithm yields excellent saturation of hidden units, which cannot be achieved by the BP algorithm, and furthermore has better generalization ability than the BP algorithm.
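To make the notion of "saturation of hidden units" concrete, the following is a minimal illustrative sketch (not the paper's implementation): a three-layer sigmoid network trained by plain back-propagation on XOR, followed by a simple saturation measure — how close the hidden activations sit to 0 or 1. The network size, learning rate, and the exact saturation formula are assumptions for illustration only.

```python
import numpy as np

# Illustrative sketch, NOT the paper's code: three-layer sigmoid network,
# batch back-propagation on XOR, then a simple hidden-unit saturation score.
rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# XOR training set
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)

n_hidden = 4  # assumed size; the paper's architectures may differ
W1 = rng.normal(0.0, 1.0, (2, n_hidden)); b1 = np.zeros(n_hidden)
W2 = rng.normal(0.0, 1.0, (n_hidden, 1)); b2 = np.zeros(1)

lr = 0.5
for _ in range(20000):
    H = sigmoid(X @ W1 + b1)            # hidden activations
    Y = sigmoid(H @ W2 + b2)            # network output
    dY = (Y - T) * Y * (1 - Y)          # output-layer delta
    dH = (dY @ W2.T) * H * (1 - H)      # hidden-layer delta
    W2 -= lr * (H.T @ dY); b2 -= lr * dY.sum(axis=0)
    W1 -= lr * (X.T @ dH); b1 -= lr * dH.sum(axis=0)

H = sigmoid(X @ W1 + b1)
# Saturation in [0, 1]: 1.0 means every hidden activation is exactly 0 or 1.
saturation = float(np.mean(2.0 * np.abs(H - 0.5)))
error = float(np.mean((sigmoid(H @ W2 + b2) - T) ** 2))
print(f"mse={error:.4f} saturation={saturation:.2f}")
```

Under this (assumed) measure, BP-trained hidden units often remain in the sigmoid's linear region, whereas the paper reports that FL training drives them toward saturation.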
The copyright of the original papers published on this site belongs to IEICE. Unauthorized use of the original or translated papers is prohibited. See IEICE Provisions on Copyright for details.
Masanori HAMAMOTO, Joarder KAMRUZZAMAN, Yukio KUMAGAI, Hiromitsu HIKITA, "Generalization Ability of Feedforward Neural Network Trained by Fahlman and Lebiere's Learning Algorithm" in IEICE TRANSACTIONS on Fundamentals,
vol. E75-A, no. 11, pp. 1597-1601, November 1992, doi: .
URL: https://global.ieice.org/en_transactions/fundamentals/10.1587/e75-a_11_1597/_p
@ARTICLE{e75-a_11_1597,
author={Masanori HAMAMOTO and Joarder KAMRUZZAMAN and Yukio KUMAGAI and Hiromitsu HIKITA},
journal={IEICE TRANSACTIONS on Fundamentals},
title={Generalization Ability of Feedforward Neural Network Trained by Fahlman and Lebiere's Learning Algorithm},
year={1992},
volume={E75-A},
number={11},
pages={1597-1601},
abstract={Fahlman and Lebiere's (FL) learning algorithm begins with a two-layer network and, in the course of training, can construct various network architectures. We applied the FL algorithm to the same three-layer network architecture as a back-propagation (BP) network and compared their generalization properties. Simulation results show that the FL algorithm yields excellent saturation of hidden units, which cannot be achieved by the BP algorithm, and furthermore has better generalization ability than the BP algorithm.},
keywords={},
doi={},
ISSN={},
month={November},}
TY - JOUR
TI - Generalization Ability of Feedforward Neural Network Trained by Fahlman and Lebiere's Learning Algorithm
T2 - IEICE TRANSACTIONS on Fundamentals
SP - 1597
EP - 1601
AU - Masanori HAMAMOTO
AU - Joarder KAMRUZZAMAN
AU - Yukio KUMAGAI
AU - Hiromitsu HIKITA
PY - 1992
DO -
JO - IEICE TRANSACTIONS on Fundamentals
SN -
VL - E75-A
IS - 11
JA - IEICE TRANSACTIONS on Fundamentals
Y1 - 1992/11//
AB - Fahlman and Lebiere's (FL) learning algorithm begins with a two-layer network and, in the course of training, can construct various network architectures. We applied the FL algorithm to the same three-layer network architecture as a back-propagation (BP) network and compared their generalization properties. Simulation results show that the FL algorithm yields excellent saturation of hidden units, which cannot be achieved by the BP algorithm, and furthermore has better generalization ability than the BP algorithm.
ER -