This letter describes the concepts that the learnability of multilayer neural networks is confined to a constrained hypersurface in the learning space formed by the input and output subspaces of the network, and that a priori information, which provides constraints on the learning space, is required for generalization.
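As a rough, hypothetical illustration of the letter's second claim, that a priori information constraining the learning space aids generalization, the following sketch (not taken from the letter; the model size, data, and hyperparameters are all assumptions) trains a small one-hidden-layer network on sparse, noisy data with and without an L2 weight penalty, one common way of imposing such a constraint, and compares test error.

```python
# Minimal, self-contained sketch: a priori constraints (here an L2 weight penalty)
# on a small multilayer network, compared against an unconstrained fit on the same
# sparse, noisy training data. All names and settings below are illustrative.
import numpy as np

rng = np.random.default_rng(0)

def target(x):
    return np.sin(2.0 * np.pi * x)          # underlying function to be learned

# Sparse, noisy training set; a denser clean test set to probe generalization.
x_train = rng.uniform(-1.0, 1.0, size=(12, 1))
y_train = target(x_train) + 0.1 * rng.normal(size=x_train.shape)
x_test = np.linspace(-1.0, 1.0, 200).reshape(-1, 1)
y_test = target(x_test)

def train_mlp(l2=0.0, hidden=30, lr=0.05, steps=20000):
    """One-hidden-layer tanh network trained by full-batch gradient descent.
    `l2` sets the strength of the a priori constraint (weight-decay penalty)."""
    W1 = rng.normal(scale=1.0, size=(1, hidden))
    b1 = np.zeros(hidden)
    W2 = rng.normal(scale=1.0 / np.sqrt(hidden), size=(hidden, 1))
    b2 = np.zeros(1)
    n = len(x_train)
    for _ in range(steps):
        # forward pass
        h = np.tanh(x_train @ W1 + b1)
        pred = h @ W2 + b2
        err = pred - y_train
        # backward pass: mean-squared error plus L2 penalty on the weights
        dW2 = h.T @ err / n + l2 * W2
        db2 = err.mean(axis=0)
        dh = err @ W2.T * (1.0 - h ** 2)
        dW1 = x_train.T @ dh / n + l2 * W1
        db1 = dh.mean(axis=0)
        W1 -= lr * dW1; b1 -= lr * db1
        W2 -= lr * dW2; b2 -= lr * db2
    def predict(x):
        return np.tanh(x @ W1 + b1) @ W2 + b2
    return predict

for l2 in (0.0, 1e-3):
    f = train_mlp(l2=l2)
    mse = float(np.mean((f(x_test) - y_test) ** 2))
    print(f"l2={l2:g}  test MSE={mse:.4f}")
```

The weight penalty is only one stand-in for the kind of a priori constraint the letter discusses; any prior restriction of the learning space could be substituted in the same comparison.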
Yasuo ITOH, "Considerations of Learnability of Multilayer Neural Networks" in IEICE TRANSACTIONS on Fundamentals,
vol. E76-A, no. 2, pp. 239-241, February 1993, doi: .
Abstract: This letter describes the concepts that the learnability of multilayer neural networks is confined to a constrained hypersurface in the learning space formed by the input and output subspaces of the network, and that a priori information, which provides constraints on the learning space, is required for generalization.
URL: https://global.ieice.org/en_transactions/fundamentals/10.1587/e76-a_2_239/_p
@ARTICLE{e76-a_2_239,
author={Yasuo ITOH},
journal={IEICE TRANSACTIONS on Fundamentals},
title={Considerations of Learnability of Multilayer Neural Networks},
year={1993},
volume={E76-A},
number={2},
pages={239-241},
abstract={This letter describes the concepts that the learnability of multilayer neural networks is confined to a constrained hypersurface in the learning space formed by the input and output subspaces of the network, and that a priori information, which provides constraints on the learning space, is required for generalization.},
keywords={},
doi={},
ISSN={},
month={February}
}
TY - JOUR
TI - Considerations of Learnability of Multilayer Neural Networks
T2 - IEICE TRANSACTIONS on Fundamentals
SP - 239
EP - 241
AU - Yasuo ITOH
PY - 1993
DO -
JO - IEICE TRANSACTIONS on Fundamentals
SN -
VL - E76-A
IS - 2
JA - IEICE TRANSACTIONS on Fundamentals
Y1 - February 1993
AB - This letter describes the concepts that the learnability of multilayer neural networks is confined to a constrained hypersurface in the learning space formed by the input and output subspaces of the network, and that a priori information, which provides constraints on the learning space, is required for generalization.
ER -