Noritaka SHIGEI Hiromi MIYAJIMA Michiharu MAEDA
Learning algorithms for Vector Quantization (VQ) fall into two categories: batch learning and incremental learning. Incremental learning is the more versatile of the two because, unlike batch learning, it can be performed either on-line or off-line. In this paper, we develop effective incremental learning methods based on Stochastic Relaxation (SR) techniques, which were originally developed for batch learning. For batch learning, SR techniques have been shown to provide good global optimization without greatly increasing the computational cost. We empirically investigate effective implementations of SR for incremental learning. Specifically, we consider five SR methods: ISR1, ISR2, ISR3, WSR1, and WSR2. The ISR methods add noise to input vectors, whereas the WSR methods add noise to weight vectors; the methods differ in when the perturbed input or weight vectors are used during learning. These SR methods are applied to three incremental learning algorithms: K-means, Neural-Gas (NG), and Kohonen's Self-Organizing Map (SOM). We comprehensively evaluate these combinations in terms of accuracy and computation time. Our simulation results show that K-means with ISR3 is the most effective combination overall and is superior to the conventional NG method, which is regarded as an excellent one.
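To illustrate the general idea, the following is a minimal sketch of incremental K-means VQ with an ISR-style perturbation, assuming Gaussian input noise whose amplitude decays over training. The function name, the linear decay schedules, and the choice to use the perturbed input for both the winner search and the update are illustrative assumptions; the abstract does not specify how ISR1, ISR2, and ISR3 differ, only that the variants differ in when the perturbed vectors are used.

```python
import numpy as np

def incremental_kmeans_isr(data, k, epochs=20, lr0=0.5, noise0=0.5, seed=0):
    """Online K-means VQ with an ISR-style perturbation (a sketch).

    Gaussian noise with a decaying amplitude is added to each input
    vector; here the perturbed input drives both the winner search and
    the weight update, which is one of several possible ISR variants.
    """
    rng = np.random.default_rng(seed)
    # Initialize the codebook from randomly chosen training vectors.
    w = data[rng.choice(len(data), size=k, replace=False)].astype(float)
    n_steps = epochs * len(data)
    t = 0
    for _ in range(epochs):
        for x in rng.permutation(data):
            frac = t / n_steps
            lr = lr0 * (1.0 - frac)        # learning rate decays to 0
            sigma = noise0 * (1.0 - frac)  # noise amplitude decays to 0
            x_pert = x + rng.normal(0.0, sigma, size=x.shape)
            # Winner-take-all update toward the perturbed input.
            i = np.argmin(np.sum((w - x_pert) ** 2, axis=1))
            w[i] += lr * (x_pert - w[i])
            t += 1
    return w

# Usage: quantize synthetic clustered data and report the distortion
# (mean squared error) measured on the unperturbed inputs.
if __name__ == "__main__":
    rng = np.random.default_rng(1)
    centers = rng.uniform(-5, 5, size=(4, 2))
    data = np.vstack([c + rng.normal(0, 0.3, size=(200, 2)) for c in centers])
    codebook = incremental_kmeans_isr(data, k=4)
    d2 = ((data[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=2)
    print("quantization MSE:", d2.min(axis=1).mean())
```

The decaying noise plays the role of the relaxation schedule: early perturbations help the codebook escape poor local minima, while the vanishing amplitude lets learning converge in the final epochs.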