This letter presents an efficient technique to reduce the computational complexity involved in training binary convolutional neural networks (BCNN). Conventionally, BCNN training is conducted by optimizing the sign of each weight element rather than its exact value; once an element has been updated to a magnitude large enough to be clipped, its sign is unlikely to flip again. The proposed technique does not update such clipped elements, thereby eliminating the computations involved in their optimization. The complexity reduction achieved by the proposed technique is as high as 25.52% when training the BCNN model for the CIFAR-10 classification task, while the accuracy is maintained without severe degradation.
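The update rule described in the abstract can be sketched roughly as follows. This is a hypothetical NumPy illustration, not the authors' implementation: the clipping threshold, the plain SGD step, and all names are assumptions made for the sketch.

```python
import numpy as np

def clipping_aware_update(w, grad, lr=0.01, clip=1.0):
    """Update the latent real-valued BCNN weights, skipping elements
    already clipped to the boundary of [-clip, clip].

    A clipped element keeps its sign under further updates, so its
    binarized value sign(w) cannot change; the update computation for
    such elements is therefore skipped (hypothetical sketch).
    """
    active = np.abs(w) < clip            # elements not yet clipped out
    w_new = w.copy()
    w_new[active] -= lr * grad[active]   # SGD step on active elements only
    np.clip(w_new, -clip, clip, out=w_new)
    skipped_fraction = 1.0 - active.mean()
    return w_new, skipped_fraction

# Forward-pass binarization would use b = sign(w); here we only
# illustrate which latent weights receive an update.
w = np.array([0.3, -1.0, 1.0, -0.2])
g = np.array([0.5, -0.5, 0.5, 0.5])
w2, frac = clipping_aware_update(w, g, lr=0.1)
```

In this toy example, the two boundary elements (±1.0) are left untouched, so half of the update computations are skipped, which is the source of the complexity reduction reported in the letter.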
Changho RYU
Korea Aerospace University
Tae-Hwan KIM
Korea Aerospace University
Changho RYU, Tae-Hwan KIM, "Low-Complexity Training for Binary Convolutional Neural Networks Based on Clipping-Aware Weight Update" in IEICE TRANSACTIONS on Information,
vol. E104-D, no. 6, pp. 919-922, June 2021, doi: 10.1587/transinf.2020EDL8143.
URL: https://global.ieice.org/en_transactions/information/10.1587/transinf.2020EDL8143/_p
@ARTICLE{e104-d_6_919,
author={Changho RYU and Tae-Hwan KIM},
journal={IEICE TRANSACTIONS on Information},
title={Low-Complexity Training for Binary Convolutional Neural Networks Based on Clipping-Aware Weight Update},
year={2021},
volume={E104-D},
number={6},
pages={919--922},
doi={10.1587/transinf.2020EDL8143},
ISSN={1745-1361},
month={June},}
TY - JOUR
TI - Low-Complexity Training for Binary Convolutional Neural Networks Based on Clipping-Aware Weight Update
T2 - IEICE TRANSACTIONS on Information
SP - 919
EP - 922
AU - Changho RYU
AU - Tae-Hwan KIM
PY - 2021
DO - 10.1587/transinf.2020EDL8143
JO - IEICE TRANSACTIONS on Information
SN - 1745-1361
VL - E104-D
IS - 6
JA - IEICE TRANSACTIONS on Information
Y1 - June 2021
ER -