Model selection for neural networks is an essential procedure for achieving not only high generalization performance but also a compact data model. In terms of obtaining a compact model in particular, neural networks usually outperform other kinds of machine learning methods. Generally, models are selected by trial-and-error testing using all learning samples given in advance. In many cases, however, it is difficult and time-consuming to prepare all learning samples in advance. To overcome these inconveniences, we propose a hybrid on-line learning system for a radial basis function (RBF) network that quickly learns novel instances by rote during on-line periods (awake phases) and performs pseudo-rehearsal for model selection during out-of-service periods (sleep phases). We call this system Incremental Learning with Sleep (ILS). During sleep phases, the system basically stops learning novel instances, and during awake phases, it responds quickly. We also extended the system to shorten the periodic sleep periods. Experimental results showed that the system selects more compact data models than those selected by other machine learning systems.
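The awake/sleep cycle described above can be illustrated with a minimal, hypothetical sketch: rote learning allocates one RBF unit per novel instance during the awake phase, and the sleep phase performs pseudo-rehearsal by sampling pseudo-patterns from the current network and refitting a smaller network on them. This is not the paper's ILS algorithm — the novelty threshold, the uniform pseudo-sample distribution, the fixed grid of replacement centers, and the least-squares refit are all assumptions made for illustration only.

```python
import numpy as np

class RBFNet:
    """Minimal 1-D RBF network: f(x) = sum_i w_i * exp(-(x - c_i)^2 / (2*s^2))."""

    def __init__(self, width=0.5):
        self.centers = np.empty((0, 1))
        self.weights = np.empty(0)
        self.width = width

    def _phi(self, X):
        # Gaussian activation of each center for each input row: (n_samples, n_centers).
        d2 = (X - self.centers.T) ** 2
        return np.exp(-d2 / (2 * self.width ** 2))

    def predict(self, X):
        if len(self.weights) == 0:
            return np.zeros(len(X))
        return self._phi(X) @ self.weights

    def learn_by_rote(self, x, y):
        """Awake phase: memorize a novel instance by allocating a new unit
        whose weight is the current residual error (assumed novelty rule)."""
        err = y - self.predict(np.array([[x]]))[0]
        if abs(err) > 1e-3:  # hypothetical novelty threshold
            self.centers = np.vstack([self.centers, [[x]]])
            self.weights = np.append(self.weights, err)

    def sleep(self, n_pseudo=200, n_units=8, rng=None):
        """Sleep phase: pseudo-rehearsal. Draw pseudo-inputs, label them with
        the current network, then refit a smaller (more compact) network."""
        rng = np.random.default_rng(rng)
        lo, hi = self.centers.min(), self.centers.max()
        Xp = rng.uniform(lo, hi, size=(n_pseudo, 1))
        Yp = self.predict(Xp)  # pseudo-targets generated by the old model
        # Compact model: fixed grid of n_units centers, least-squares weights.
        self.centers = np.linspace(lo, hi, n_units).reshape(-1, 1)
        Phi = self._phi(Xp)
        self.weights, *_ = np.linalg.lstsq(Phi, Yp, rcond=None)
```

For example, rote-learning 40 samples of sin(x) on [0, pi] allocates dozens of units; one sleep phase then compresses the model to the requested number of units while approximately preserving the learned mapping.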
The copyright of the original papers published on this site belongs to IEICE. Unauthorized use of the original or translated papers is prohibited. See IEICE Provisions on Copyright for details.
Koichiro YAMAUCHI, Jiro HAYAMI, "Incremental Learning and Model Selection for Radial Basis Function Network through Sleep" in IEICE TRANSACTIONS on Information and Systems,
vol. E90-D, no. 4, pp. 722-735, April 2007, doi: 10.1093/ietisy/e90-d.4.722.
Abstract: The model selection for neural networks is an essential procedure to get not only high levels of generalization but also a compact data model. Especially in terms of getting the compact model, neural networks usually outperform other kinds of machine learning methods. Generally, models are selected by trial and error testing using whole learning samples given in advance. In many cases, however, it is difficult and time consuming to prepare whole learning samples in advance. To overcome these inconveniences, we propose a hybrid on-line learning system for a radial basis function (RBF) network that repeats quick learning of novel instances by rote during on-line periods (awake phases) and repeats pseudo rehearsal for model selection during out-of-service periods (sleep phases). We call this system Incremental Learning with Sleep (ILS). During sleep phases, the system basically stops the learning of novel instances, and during awake phases, the system responds quickly. We also extended the system so as to shorten the periodic sleep periods. Experimental results showed the system selects more compact data models than those selected by other machine learning systems.
URL: https://global.ieice.org/en_transactions/information/10.1093/ietisy/e90-d.4.722/_p
@ARTICLE{e90-d_4_722,
author={Koichiro YAMAUCHI and Jiro HAYAMI},
journal={IEICE TRANSACTIONS on Information and Systems},
title={Incremental Learning and Model Selection for Radial Basis Function Network through Sleep},
year={2007},
volume={E90-D},
number={4},
pages={722-735},
abstract={The model selection for neural networks is an essential procedure to get not only high levels of generalization but also a compact data model. Especially in terms of getting the compact model, neural networks usually outperform other kinds of machine learning methods. Generally, models are selected by trial and error testing using whole learning samples given in advance. In many cases, however, it is difficult and time consuming to prepare whole learning samples in advance. To overcome these inconveniences, we propose a hybrid on-line learning system for a radial basis function (RBF) network that repeats quick learning of novel instances by rote during on-line periods (awake phases) and repeats pseudo rehearsal for model selection during out-of-service periods (sleep phases). We call this system Incremental Learning with Sleep (ILS). During sleep phases, the system basically stops the learning of novel instances, and during awake phases, the system responds quickly. We also extended the system so as to shorten the periodic sleep periods. Experimental results showed the system selects more compact data models than those selected by other machine learning systems.},
keywords={},
doi={10.1093/ietisy/e90-d.4.722},
ISSN={1745-1361},
month={April}
}
TY - JOUR
TI - Incremental Learning and Model Selection for Radial Basis Function Network through Sleep
T2 - IEICE TRANSACTIONS on Information and Systems
SP - 722
EP - 735
AU - Koichiro YAMAUCHI
AU - Jiro HAYAMI
PY - 2007
DO - 10.1093/ietisy/e90-d.4.722
JO - IEICE TRANSACTIONS on Information and Systems
SN - 1745-1361
VL - E90-D
IS - 4
JA - IEICE TRANSACTIONS on Information and Systems
Y1 - April 2007
AB - The model selection for neural networks is an essential procedure to get not only high levels of generalization but also a compact data model. Especially in terms of getting the compact model, neural networks usually outperform other kinds of machine learning methods. Generally, models are selected by trial and error testing using whole learning samples given in advance. In many cases, however, it is difficult and time consuming to prepare whole learning samples in advance. To overcome these inconveniences, we propose a hybrid on-line learning system for a radial basis function (RBF) network that repeats quick learning of novel instances by rote during on-line periods (awake phases) and repeats pseudo rehearsal for model selection during out-of-service periods (sleep phases). We call this system Incremental Learning with Sleep (ILS). During sleep phases, the system basically stops the learning of novel instances, and during awake phases, the system responds quickly. We also extended the system so as to shorten the periodic sleep periods. Experimental results showed the system selects more compact data models than those selected by other machine learning systems.
ER -