Speeding up the network layers and reducing the number of parameters in convolutional neural networks (CNNs) is a topic of great interest. In this paper, we propose a novel method, symmetric decomposition of convolution kernels (SDKs), which symmetrically separates k×k convolution kernels into (k×1 and 1×k) or (1×k and k×1) kernels. We conduct comparison experiments with network models designed by SDKs on the MNIST and CIFAR-10 datasets. Compared with the corresponding CNNs, we obtain good recognition performance with a 1.1×-1.5× speedup and more than a 30% reduction in network parameters. The experimental results indicate that our method is useful and effective for CNNs in practice, in terms of both speedup and parameter reduction.
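The parameter savings claimed above can be checked with simple arithmetic: a k×k kernel has k² weights per input/output channel pair, while the decomposed pair (k×1 followed by 1×k) has 2k. A minimal sketch, assuming the intermediate channel width equals the output width (the paper's exact layer configuration is not given here):

```python
# Compare the weight count of a standard kxk convolution with its
# symmetric decomposition into a kx1 convolution followed by a 1xk one.
# Biases and the exact intermediate channel width are simplifying assumptions.

def conv_params(k_h, k_w, c_in, c_out):
    """Number of weights in a conv layer with a k_h x k_w kernel."""
    return k_h * k_w * c_in * c_out

def sdk_params(k, c_in, c_out):
    """k x k kernel split into (k x 1) then (1 x k) kernels."""
    return conv_params(k, 1, c_in, c_out) + conv_params(1, k, c_out, c_out)

k, c = 3, 64
full = conv_params(k, k, c, c)   # 9 * 64 * 64 = 36864 weights
sdk = sdk_params(k, c, c)        # (3 + 3) * 64 * 64 = 24576 weights
print(f"reduction: {1 - sdk / full:.2%}")
```

For 3×3 kernels with equal input and output widths this gives a one-third reduction, consistent with the "more than 30%" figure reported in the abstract.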
The copyright of the original papers published on this site belongs to IEICE. Unauthorized use of the original or translated papers is prohibited. See IEICE Provisions on Copyright for details.
Jun OU, Yujian LI, "Symmetric Decomposition of Convolution Kernels" in IEICE TRANSACTIONS on Information,
vol. E102-D, no. 1, pp. 219-222, January 2019, doi: 10.1587/transinf.2018EDL8136.
Abstract: Speeding up the network layers and reducing the number of parameters in convolutional neural networks (CNNs) is a topic of great interest. In this paper, we propose a novel method, symmetric decomposition of convolution kernels (SDKs), which symmetrically separates k×k convolution kernels into (k×1 and 1×k) or (1×k and k×1) kernels. We conduct comparison experiments with network models designed by SDKs on the MNIST and CIFAR-10 datasets. Compared with the corresponding CNNs, we obtain good recognition performance with a 1.1×-1.5× speedup and more than a 30% reduction in network parameters. The experimental results indicate that our method is useful and effective for CNNs in practice, in terms of both speedup and parameter reduction.
URL: https://global.ieice.org/en_transactions/information/10.1587/transinf.2018EDL8136/_p
@ARTICLE{e102-d_1_219,
author={Jun OU and Yujian LI},
journal={IEICE TRANSACTIONS on Information},
title={Symmetric Decomposition of Convolution Kernels},
year={2019},
volume={E102-D},
number={1},
pages={219-222},
abstract={Speeding up the network layers and reducing the number of parameters in convolutional neural networks (CNNs) is a topic of great interest. In this paper, we propose a novel method, symmetric decomposition of convolution kernels (SDKs), which symmetrically separates k×k convolution kernels into (k×1 and 1×k) or (1×k and k×1) kernels. We conduct comparison experiments with network models designed by SDKs on the MNIST and CIFAR-10 datasets. Compared with the corresponding CNNs, we obtain good recognition performance with a 1.1×-1.5× speedup and more than a 30% reduction in network parameters. The experimental results indicate that our method is useful and effective for CNNs in practice, in terms of both speedup and parameter reduction.},
keywords={},
doi={10.1587/transinf.2018EDL8136},
ISSN={1745-1361},
month={January}
}
TY - JOUR
TI - Symmetric Decomposition of Convolution Kernels
T2 - IEICE TRANSACTIONS on Information
SP - 219
EP - 222
AU - Jun OU
AU - Yujian LI
PY - 2019
DO - 10.1587/transinf.2018EDL8136
JO - IEICE TRANSACTIONS on Information
SN - 1745-1361
VL - E102-D
IS - 1
JA - IEICE TRANSACTIONS on Information
Y1 - January 2019
AB - Speeding up the network layers and reducing the number of parameters in convolutional neural networks (CNNs) is a topic of great interest. In this paper, we propose a novel method, symmetric decomposition of convolution kernels (SDKs), which symmetrically separates k×k convolution kernels into (k×1 and 1×k) or (1×k and k×1) kernels. We conduct comparison experiments with network models designed by SDKs on the MNIST and CIFAR-10 datasets. Compared with the corresponding CNNs, we obtain good recognition performance with a 1.1×-1.5× speedup and more than a 30% reduction in network parameters. The experimental results indicate that our method is useful and effective for CNNs in practice, in terms of both speedup and parameter reduction.
ER -