The multilayer feedforward neural network (MFNN) trained by the backpropagation (BP) algorithm is one of the most significant models in artificial neural networks. MFNNs have been used in many areas of signal and image processing because of their broad applicability. Although they have been implemented as analog, mixed analog-digital, and fully digital VLSI circuits, it is still difficult to realize efficient hardware implementations that include the BP learning function. This paper describes a special BP algorithm for the logic oriented neural network (LOGO-NN), which we have proposed as a type of MFNN with quantized weights and multilevel threshold neurons. In LOGO-NNs, both weights and neuron outputs are quantized to integer values. Furthermore, the proposed BP algorithm reduces the need for high-precision calculations. It is therefore expected that LOGO-NNs with BP learning can be implemented more efficiently as digital circuits than common MFNNs trained with the classical BP algorithm. Finally, simulations show that the proposed BP algorithm for LOGO-NNs performs well in terms of convergence rate, convergence speed, and generalization capability.
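The abstract describes the key idea only at a high level. As a rough illustration, the Python sketch below shows what a forward pass with integer weights and a multilevel threshold activation might look like. The staircase activation (a sigmoid quantized to a fixed number of integer levels), the level placement, and all function names are assumptions for illustration only; they are not the paper's actual LOGO-NN definitions, and the paper's modified BP rule is not reproduced here.

```python
import numpy as np

def multilevel_threshold(u, levels=8, gain=0.25):
    # Staircase activation: a sigmoid response quantized to integer levels.
    # Hypothetical stand-in for the paper's multilevel threshold neuron;
    # the actual level placement used in LOGO-NNs may differ.
    s = 1.0 / (1.0 + np.exp(-gain * u))        # smooth reference curve in (0, 1)
    return np.minimum(np.floor(s * levels), levels - 1).astype(int)

def forward_layer(x, W, b, levels=8):
    # One layer of a quantized forward pass: integer inputs, integer
    # weights and biases, exact integer accumulation, then the multilevel
    # threshold maps the sum back onto a small set of integer levels.
    u = W @ x + b
    return multilevel_threshold(u, levels)

# Toy usage: 3 quantized inputs feeding 2 multilevel neurons.
rng = np.random.default_rng(0)
x = rng.integers(0, 8, size=3)                 # quantized inputs in {0, ..., 7}
W = rng.integers(-4, 5, size=(2, 3))           # quantized weights in {-4, ..., 4}
b = rng.integers(-4, 5, size=2)
print(forward_layer(x, W, b))                  # integer outputs in {0, ..., 7}
```

Because both the accumulation and the outputs stay in small integer ranges, a circuit realizing such a layer can avoid wide floating-point datapaths, which is the motivation the abstract gives for digital implementation.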
Takeshi KAMIO, Hisato FUJISAKA, Mititada MORISUE, "Backpropagation Algorithm for LOGic Oriented Neural Networks with Quantized Weights and Multilevel Threshold Neurons," in IEICE TRANSACTIONS on Fundamentals, vol. E84-A, no. 3, pp. 705-712, March 2001, doi: 10.1587/e84-a_3_705.
URL: https://global.ieice.org/en_transactions/fundamentals/10.1587/e84-a_3_705/_p
@ARTICLE{e84-a_3_705,
author={Takeshi KAMIO and Hisato FUJISAKA and Mititada MORISUE},
journal={IEICE TRANSACTIONS on Fundamentals},
title={Backpropagation Algorithm for LOGic Oriented Neural Networks with Quantized Weights and Multilevel Threshold Neurons},
year={2001},
volume={E84-A},
number={3},
pages={705-712},
doi={10.1587/e84-a_3_705},
month={March},}
TY - JOUR
TI - Backpropagation Algorithm for LOGic Oriented Neural Networks with Quantized Weights and Multilevel Threshold Neurons
T2 - IEICE TRANSACTIONS on Fundamentals
SP - 705
EP - 712
AU - Takeshi KAMIO
AU - Hisato FUJISAKA
AU - Mititada MORISUE
PY - 2001
DO - 10.1587/e84-a_3_705
JO - IEICE TRANSACTIONS on Fundamentals
VL - E84-A
IS - 3
JA - IEICE TRANSACTIONS on Fundamentals
Y1 - March 2001
ER -