Michiharu MAEDA, Hiromi MIYAJIMA, and Sadayuki MURASHIMA, "An Adaptive Learning and Self-Deleting Neural Network for Vector Quantization," IEICE TRANSACTIONS on Fundamentals,
vol. E79-A, no. 11, pp. 1886-1893, November 1996.
Abstract: This paper describes an adaptive neural vector quantization algorithm with a deleting approach for weight (reference) vectors, which we call the adaptive learning and self-deleting algorithm. We first introduce an improved topological neighborhood and an adaptive vector quantization algorithm that depends little on the initial values of the weight vectors. We then present the adaptive learning and self-deleting algorithm, which proceeds as follows: first, many weight vectors are prepared and trained with Kohonen's self-organizing feature map; next, weight vectors are deleted sequentially until a fixed number remain, and training continues with competitive learning. Finally, we compare algorithms with neighborhood relations against the proposed one. The proposed algorithm also performs well when the weight vectors are poorly initialized. Experimental results demonstrate its effectiveness.
URL: https://global.ieice.org/en_transactions/fundamentals/10.1587/e79-a_11_1886/_p
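The two-phase scheme the abstract outlines (train an oversized codebook with a self-organizing feature map, then delete weight vectors down to a fixed count while switching to competitive learning) can be sketched as below. This is a minimal illustration, not the paper's method: the decay schedules, the 1-D neighborhood, and the deletion criterion (dropping the least-winning vector) are all assumptions made for the sketch.

```python
import numpy as np

def adaptive_self_deleting_vq(data, n_init=64, n_final=16,
                              som_iters=2000, cl_iters=2000, seed=0):
    """Sketch of a two-phase vector quantizer: SOM training over an
    oversized codebook, then sequential deletion down to n_final
    vectors under winner-take-all (competitive) learning.
    Deletion by lowest win count is an assumption, not the paper's
    stated criterion."""
    rng = np.random.default_rng(seed)
    # Phase 1: prepare many weight vectors; train with a 1-D SOM.
    w = data[rng.choice(len(data), n_init, replace=False)].astype(float)
    for t in range(som_iters):
        x = data[rng.integers(len(data))]
        frac = t / som_iters
        eta = 0.5 * (1.0 - frac)                    # decaying learning rate
        sigma = max(n_init / 2 * (1.0 - frac), 0.5)  # shrinking neighborhood
        win = np.argmin(((w - x) ** 2).sum(axis=1))  # best-matching unit
        dist = np.abs(np.arange(len(w)) - win)       # 1-D index distance
        h = np.exp(-dist ** 2 / (2 * sigma ** 2))    # neighborhood weights
        w += eta * h[:, None] * (x - w)
    # Phase 2: delete vectors one by one until n_final remain,
    # updating only the winner (competitive learning).
    wins = np.zeros(len(w))
    step = cl_iters // (n_init - n_final)            # deletion interval
    for t in range(cl_iters):
        x = data[rng.integers(len(data))]
        win = np.argmin(((w - x) ** 2).sum(axis=1))
        wins[win] += 1
        w[win] += 0.05 * (x - w[win])                # winner-take-all update
        if len(w) > n_final and t % step == 0:
            drop = np.argmin(wins)                   # least-winning vector
            w = np.delete(w, drop, axis=0)
            wins = np.delete(wins, drop)
    return w
```

Starting with more vectors than needed and pruning, rather than growing a small codebook, is what makes the scheme tolerant of poor initialization: badly placed vectors simply attract few wins and are removed.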
@ARTICLE{e79-a_11_1886,
author={Maeda, Michiharu and Miyajima, Hiromi and Murashima, Sadayuki},
journal={IEICE TRANSACTIONS on Fundamentals},
title={An Adaptive Learning and Self-Deleting Neural Network for Vector Quantization},
year={1996},
volume={E79-A},
number={11},
pages={1886-1893},
abstract={This paper describes an adaptive neural vector quantization algorithm with a deleting approach for weight (reference) vectors, which we call the adaptive learning and self-deleting algorithm. We first introduce an improved topological neighborhood and an adaptive vector quantization algorithm that depends little on the initial values of the weight vectors. We then present the adaptive learning and self-deleting algorithm, which proceeds as follows: first, many weight vectors are prepared and trained with Kohonen's self-organizing feature map; next, weight vectors are deleted sequentially until a fixed number remain, and training continues with competitive learning. Finally, we compare algorithms with neighborhood relations against the proposed one. The proposed algorithm also performs well when the weight vectors are poorly initialized. Experimental results demonstrate its effectiveness.},
month={November}
}
TY - JOUR
TI - An Adaptive Learning and Self-Deleting Neural Network for Vector Quantization
T2 - IEICE TRANSACTIONS on Fundamentals
SP - 1886
EP - 1893
AU - Michiharu MAEDA
AU - Hiromi MIYAJIMA
AU - Sadayuki MURASHIMA
PY - 1996
JO - IEICE TRANSACTIONS on Fundamentals
VL - E79-A
IS - 11
JA - IEICE TRANSACTIONS on Fundamentals
Y1 - 1996/11//
AB - This paper describes an adaptive neural vector quantization algorithm with a deleting approach for weight (reference) vectors, which we call the adaptive learning and self-deleting algorithm. We first introduce an improved topological neighborhood and an adaptive vector quantization algorithm that depends little on the initial values of the weight vectors. We then present the adaptive learning and self-deleting algorithm, which proceeds as follows: first, many weight vectors are prepared and trained with Kohonen's self-organizing feature map; next, weight vectors are deleted sequentially until a fixed number remain, and training continues with competitive learning. Finally, we compare algorithms with neighborhood relations against the proposed one. The proposed algorithm also performs well when the weight vectors are poorly initialized. Experimental results demonstrate its effectiveness.
ER -