Low-Complexity Training for Binary Convolutional Neural Networks Based on Clipping-Aware Weight Update

Changho RYU, Tae-Hwan KIM


Summary:

This letter presents an efficient technique to reduce the computational complexity involved in training binary convolutional neural networks (BCNNs). Conventionally, BCNN training focuses on optimizing the sign of each weight element rather than its exact value; once an element has been updated to a magnitude large enough to be clipped, its sign is unlikely to be flipped again. The proposed technique therefore skips updating elements that have been clipped and eliminates the computations involved in their optimization. The complexity reduction achieved by the proposed technique is as high as 25.52% when training a BCNN model for the CIFAR-10 classification task, while the accuracy is maintained without severe degradation.
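The abstract describes skipping the update of weight elements whose latent values have already reached the clipping boundary. The following is a minimal NumPy sketch of that general idea, not the paper's actual algorithm: latent real-valued weights are clipped to [-1, 1], the binary weights used in the forward pass are their signs, and elements that are already clipped are masked out of the update. All names (clipping_aware_update, binarize, lr, clip) are illustrative assumptions; in the paper the saving comes from avoiding the update computation for clipped elements altogether, whereas the mask here only illustrates the selection.

```python
import numpy as np

def clipping_aware_update(w_latent, grad, lr=0.01, clip=1.0):
    """Update only latent weight elements that have not yet been clipped."""
    active = np.abs(w_latent) < clip           # elements still inside the clip range
    w_latent = w_latent - lr * grad * active   # clipped elements are left unchanged
    return np.clip(w_latent, -clip, clip)      # keep latent weights within [-clip, clip]

def binarize(w_latent):
    """Binary weights used in the forward pass are the signs of the latent weights."""
    return np.where(w_latent >= 0.0, 1.0, -1.0)

# Toy usage with random latent weights and gradients
w = np.random.uniform(-1.0, 1.0, size=(3, 3)).astype(np.float32)
g = np.random.randn(3, 3).astype(np.float32)
w = clipping_aware_update(w, g)
w_bin = binarize(w)
```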

Publication
IEICE TRANSACTIONS on Information Vol.E104-D No.6 pp.919-922
Publication Date
2021/06/01
Publicized
2021/03/17
Online ISSN
1745-1361
DOI
10.1587/transinf.2020EDL8143
Type of Manuscript
LETTER
Category
Biocybernetics, Neurocomputing

Authors

Changho RYU
  Korea Aerospace University
Tae-Hwan KIM
  Korea Aerospace University

Keyword