This paper proposes a practical training algorithm for artificial neural networks, by which both the optimally pruned model and the optimally trained parameter for the minimum prediction error can be found simultaneously. In the proposed algorithm, the conventional information criterion is modified into a differentiable function of weight parameters, and then it is minimized while being controlled back to the conventional form. Since this method has several theoretical problems, its effectiveness is examined by computer simulations and by an application to practical ultrasonic image reconstruction.
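The abstract's idea of replacing the discrete parameter count in an information criterion with a differentiable surrogate, then annealing it back toward the hard count, can be illustrated with a minimal sketch. This is not the paper's algorithm, only an assumed reading of the abstract: here AIC's complexity term `k` (number of nonzero weights) is approximated by `sum(1 - exp(-beta * w**2))`, which is smooth in `w` and tends to the nonzero-weight count as `beta` grows. The toy linear model, the surrogate form, and all hyperparameters are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data: y depends only on the first of five features,
# so the remaining four weights should be pruned to zero.
X = rng.normal(size=(200, 5))
y = 1.5 * X[:, 0] + 0.1 * rng.normal(size=200)

n = len(y)
w = rng.normal(scale=0.5, size=5)
lr = 0.05


def soft_count(w, beta):
    """Differentiable surrogate for the number of nonzero weights.

    Each term 1 - exp(-beta * w_i**2) is near 0 for w_i ~ 0 and near 1
    for |w_i| >> 1/sqrt(beta); as beta -> infinity it approaches the
    hard count used by the conventional criterion.
    """
    return np.sum(1.0 - np.exp(-beta * w**2))


for step in range(3000):
    beta = 1.0 + 0.05 * step  # anneal the surrogate toward the hard count

    # Gradient of the mean squared training error.
    resid = X @ w - y
    grad_err = 2.0 * X.T @ resid / n

    # Gradient of the (2/n)-weighted soft complexity penalty:
    # d/dw (2/n) * soft_count(w, beta).
    grad_pen = (2.0 / n) * 2.0 * beta * w * np.exp(-beta * w**2)

    w -= lr * (grad_err + grad_pen)

print(np.round(w, 3))  # spurious weights are driven toward zero
```

Because the penalty gradient vanishes for weights that are large relative to `1/sqrt(beta)` but keeps pushing small weights toward zero, the anneal effectively prunes the model while the error term fits the surviving weights, which is one way to read "minimized while being controlled back to the conventional form".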
Sumio WATANABE, "A Modified Information Criterion for Automatic Model and Parameter Selection in Neural Network Learning" in IEICE TRANSACTIONS on Information,
vol. E78-D, no. 4, pp. 490-499, April 1995.
URL: https://global.ieice.org/en_transactions/information/10.1587/e78-d_4_490/_p
@ARTICLE{e78-d_4_490,
author={Sumio WATANABE},
journal={IEICE TRANSACTIONS on Information},
title={A Modified Information Criterion for Automatic Model and Parameter Selection in Neural Network Learning},
year={1995},
volume={E78-D},
number={4},
pages={490-499},
abstract={This paper proposes a practical training algorithm for artificial neural networks, by which both the optimally pruned model and the optimally trained parameter for the minimum prediction error can be found simultaneously. In the proposed algorithm, the conventional information criterion is modified into a differentiable function of weight parameters, and then it is minimized while being controlled back to the conventional form. Since this method has several theoretical problems, its effectiveness is examined by computer simulations and by an application to practical ultrasonic image reconstruction.},
keywords={},
doi={},
ISSN={},
month={April},}
TY - JOUR
TI - A Modified Information Criterion for Automatic Model and Parameter Selection in Neural Network Learning
T2 - IEICE TRANSACTIONS on Information
SP - 490
EP - 499
AU - Sumio WATANABE
PY - 1995
DO -
JO - IEICE TRANSACTIONS on Information
SN -
VL - E78-D
IS - 4
JA - IEICE TRANSACTIONS on Information
Y1 - April 1995
AB - This paper proposes a practical training algorithm for artificial neural networks, by which both the optimally pruned model and the optimally trained parameter for the minimum prediction error can be found simultaneously. In the proposed algorithm, the conventional information criterion is modified into a differentiable function of weight parameters, and then it is minimized while being controlled back to the conventional form. Since this method has several theoretical problems, its effectiveness is examined by computer simulations and by an application to practical ultrasonic image reconstruction.
ER -