IEICE TRANSACTIONS on Fundamentals

Efficient Mini-Batch Training on Memristor Neural Network Integrating Gradient Calculation and Weight Update

Satoshi YAMAMORI, Masayuki HIROMOTO, Takashi SATO

Summary:

We propose an efficient training method for memristor neural networks. The proposed method is suited to mini-batch training, a common technique for various neural networks. By integrating the two processes of gradient calculation in the backpropagation algorithm and weight update in the write operation to the memristors, the proposed method accelerates training and eliminates the external computing resources, such as multipliers and memories, required by the existing method. Through numerical experiments, we demonstrate that the proposed method converges twice as fast as the existing method while retaining the same level of classification accuracy.
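
To illustrate the idea behind the integration, the following is a minimal numerical sketch, not the authors' circuit or algorithm: it simulates a single memristor crossbar layer in which the write operation itself realizes the outer product of the input and the error, so no external multiplier or gradient memory is needed. The class CrossbarSim, the function train_minibatch, the per-sample update within a mini-batch, and all data and hyperparameters are illustrative assumptions made for this sketch.

    import numpy as np

    rng = np.random.default_rng(0)

    class CrossbarSim:
        """Simulated memristor crossbar storing one weight matrix as conductances."""
        def __init__(self, n_in, n_out):
            # Conductance-coded weights (illustrative initialization)
            self.G = rng.uniform(-0.1, 0.1, size=(n_in, n_out))

        def forward(self, x):
            # Analog vector-matrix multiply performed by the crossbar
            return x @ self.G

        def integrated_update(self, x, err, lr):
            # Instead of computing grad = outer(x, err) externally and then
            # writing it, the write pulses realize the product directly:
            # row signals encode x, column signals encode err, so each cell
            # changes in proportion to x_i * err_j. Modeled here numerically.
            self.G -= lr * np.outer(x, err)

    def train_minibatch(layer, X, Y, lr=0.1, epochs=20, batch=8):
        # Updates are applied per sample within each mini-batch (an assumption
        # of this sketch), avoiding external storage of accumulated gradients.
        for _ in range(epochs):
            idx = rng.permutation(len(X))
            for s in range(0, len(X), batch):
                for x, y in zip(X[idx[s:s + batch]], Y[idx[s:s + batch]]):
                    err = layer.forward(x) - y      # output error (squared loss)
                    layer.integrated_update(x, err, lr / batch)
        return layer

    # Toy usage: learn a random linear map
    X = rng.normal(size=(64, 4))
    W_true = rng.normal(size=(4, 3))
    layer = train_minibatch(CrossbarSim(4, 3), X, X @ W_true)
    print("weight error:", np.abs(layer.G - W_true).max())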

Publication: IEICE TRANSACTIONS on Fundamentals Vol.E101-A No.7 pp.1092-1100
Publication Date: 2018/07/01
Online ISSN: 1745-1337
DOI: 10.1587/transfun.E101.A.1092
Type of Manuscript: PAPER
Category: Neural Networks and Bioengineering

Authors

Satoshi YAMAMORI
  Kyoto University
Masayuki HIROMOTO
  Kyoto University
Takashi SATO
  Kyoto University

Keyword