Structural learning algorithms are obtained by adding a penalty criterion (usually derived from the network structure) to the conventional sum-of-squared-errors criterion and applying the backpropagation (BP) algorithm. This problem can be viewed as a constrained minimization problem. In this paper, we apply the Lagrangian differential gradient method to structural learning based on a backpropagation-like algorithm. Computational experiments on both artificial and real data show that improved generalization performance and network optimization are obtained by applying the proposed method.
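The objective described in the abstract — a sum-of-squared-errors term plus a structure-derived penalty — can be sketched generically. The linear model, L1 penalty, and plain subgradient update below are illustrative stand-ins chosen for brevity; they are not the paper's Lagrangian differential gradient method or its actual penalty criterion.

```python
import numpy as np

# Generic sketch of a penalized objective E(w) = SSE(w) + lambda * P(w).
# The L1 penalty P(w) = sum(|w_i|) and the plain (sub)gradient update are
# hypothetical examples, NOT the authors' Lagrangian-based algorithm.

def sse(w, X, y):
    """Sum of squared errors for a linear model y ~ X @ w."""
    r = X @ w - y
    return float(r @ r)

def penalized_loss(w, X, y, lam=0.1):
    # Conventional error criterion plus a structure penalty term.
    return sse(w, X, y) + lam * float(np.abs(w).sum())

def gradient_step(w, X, y, lam=0.1, lr=0.005):
    # One step of subgradient descent on the penalized objective.
    grad = 2.0 * X.T @ (X @ w - y) + lam * np.sign(w)
    return w - lr * grad

# Toy data: a sparse true weight vector, so the penalty has something to find.
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 3))
y = X @ np.array([1.0, 0.0, -2.0]) + 0.01 * rng.normal(size=20)

w = np.zeros(3)
for _ in range(500):
    w = gradient_step(w, X, y)
```

The penalty drives redundant weights toward zero, which is the sense in which such criteria "optimize the structure" of the network; the paper instead treats the penalized problem as a constrained minimization handled via Lagrangian differential gradients.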
The copyright of the original papers published on this site belongs to IEICE. Unauthorized use of the original or translated papers is prohibited. See IEICE Provisions on Copyright for details.
Rameswar DEBNATH, Haruhisa TAKAHASHI, "A New Approach to the Structural Learning of Neural Networks" in IEICE TRANSACTIONS on Fundamentals, vol. E87-A, no. 6, pp. 1655-1658, June 2004.
Abstract: Structural learning algorithms are obtained by adding a penalty criterion (usually derived from the network structure) to the conventional sum-of-squared-errors criterion and applying the backpropagation (BP) algorithm. This problem can be viewed as a constrained minimization problem. In this paper, we apply the Lagrangian differential gradient method to structural learning based on a backpropagation-like algorithm. Computational experiments on both artificial and real data show that improved generalization performance and network optimization are obtained by applying the proposed method.
URL: https://global.ieice.org/en_transactions/fundamentals/10.1587/e87-a_6_1655/_p
@ARTICLE{e87-a_6_1655,
author={Rameswar DEBNATH and Haruhisa TAKAHASHI},
journal={IEICE TRANSACTIONS on Fundamentals},
title={A New Approach to the Structural Learning of Neural Networks},
year={2004},
volume={E87-A},
number={6},
pages={1655-1658},
abstract={Structural learning algorithms are obtained by adding a penalty criterion (usually derived from the network structure) to the conventional sum-of-squared-errors criterion and applying the backpropagation (BP) algorithm. This problem can be viewed as a constrained minimization problem. In this paper, we apply the Lagrangian differential gradient method to structural learning based on a backpropagation-like algorithm. Computational experiments on both artificial and real data show that improved generalization performance and network optimization are obtained by applying the proposed method.},
month={June},
}
TY  - JOUR
TI  - A New Approach to the Structural Learning of Neural Networks
T2  - IEICE TRANSACTIONS on Fundamentals
SP  - 1655
EP  - 1658
AU  - Rameswar DEBNATH
AU  - Haruhisa TAKAHASHI
PY  - 2004
JO  - IEICE TRANSACTIONS on Fundamentals
VL  - E87-A
IS  - 6
JA  - IEICE TRANSACTIONS on Fundamentals
Y1  - 2004/06
AB  - Structural learning algorithms are obtained by adding a penalty criterion (usually derived from the network structure) to the conventional sum-of-squared-errors criterion and applying the backpropagation (BP) algorithm. This problem can be viewed as a constrained minimization problem. In this paper, we apply the Lagrangian differential gradient method to structural learning based on a backpropagation-like algorithm. Computational experiments on both artificial and real data show that improved generalization performance and network optimization are obtained by applying the proposed method.
ER  -