Backpropagation Algorithm for LOGic Oriented Neural Networks with Quantized Weights and Multilevel Threshold Neurons

Takeshi KAMIO, Hisato FUJISAKA, Mititada MORISUE

Summary:

The multilayer feedforward neural network (MFNN) trained by the backpropagation (BP) algorithm is one of the most significant models in artificial neural networks. MFNNs have been used in many areas of signal and image processing because of their broad applicability. Although they have been implemented as analog, mixed analog-digital, and fully digital VLSI circuits, it is still difficult to realize their hardware implementation with the BP learning function efficiently. This paper describes a special BP algorithm for the logic oriented neural network (LOGO-NN), which we have proposed as a type of MFNN with quantized weights and multilevel threshold neurons. In LOGO-NNs, both the weights and the neuron outputs are quantized to integer values. Furthermore, the proposed BP algorithm reduces the need for high-precision calculations. It is therefore expected that LOGO-NNs with BP learning can be implemented more efficiently as digital circuits than common MFNNs with classical BP. Finally, simulations show that the proposed BP algorithm for LOGO-NNs performs well in terms of convergence rate, convergence speed, and generalization capability.
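
The abstract does not reproduce the paper's update rules, so the sketch below is only a rough illustration of the kind of arithmetic it describes: a single neuron with integer-quantized weights and a multilevel (staircase) threshold output, trained by a BP-style rule that keeps the stored weights integer-valued. The level count, weight range, thresholds, surrogate gradient, and update-accumulation scheme are all assumptions made for illustration, not the algorithm from the paper.

    import numpy as np

    rng = np.random.default_rng(0)

    # All constants below are assumptions for illustration, not from the paper.
    LEVELS = 4                                 # neuron outputs: integers 0..LEVELS-1
    W_MAX = 7                                  # weights: integers in [-W_MAX, W_MAX]
    THRESHOLDS = np.array([-4.0, 0.0, 4.0])    # LEVELS-1 threshold values

    def multilevel_threshold(u):
        # Staircase activation: output = number of thresholds exceeded.
        return np.sum(u[:, None] >= THRESHOLDS[None, :], axis=1).astype(float)

    def surrogate_grad(u):
        # Straight-through-style surrogate derivative: 1 near each threshold,
        # 0 elsewhere, so error signals can pass through the staircase.
        return np.sum(np.abs(u[:, None] - THRESHOLDS[None, :]) < 1.0, axis=1).astype(float)

    # Toy task: recover integer targets produced by a fixed integer weight vector.
    x = rng.integers(0, LEVELS, size=(64, 3)).astype(float)    # quantized inputs
    t = multilevel_threshold(x @ np.array([2.0, -1.0, 3.0]))   # integer targets
    w = rng.integers(-W_MAX, W_MAX + 1, size=3).astype(float)  # integer weights

    lr = 0.05
    acc = np.zeros_like(w)   # fractional updates accumulate here and are
                             # committed only as whole integer weight steps
    for _ in range(200):
        u = x @ w
        err = multilevel_threshold(u) - t
        grad = x.T @ (err * surrogate_grad(u)) / len(x)
        acc -= lr * grad
        step = np.round(acc)                  # integer part of the update
        w = np.clip(w + step, -W_MAX, W_MAX)  # weights stay quantized
        acc -= step
    print("learned integer weights:", w)

The accumulator keeps the stored weights integer-valued while still letting small gradient steps take effect over several epochs, one common way to avoid high-precision arithmetic in the stored parameters.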

Publication
IEICE TRANSACTIONS on Fundamentals Vol.E84-A No.3 pp.705-712
Publication Date
2001/03/01
Type of Manuscript
Special Section PAPER (Special Section of Selected Papers from the 13th Workshop on Circuits and Systems in Karuizawa)