Speeding up Deep Neural Networks in Speech Recognition with Piecewise Quantized Sigmoidal Activation Function

Anhao XING, Qingwei ZHAO, Yonghong YAN

Summary

This paper proposes a new quantization framework for the activation function of deep neural networks (DNNs). We implement a fixed-point DNN by quantizing the activations into powers-of-two integers, so that the costly multiplication operations in DNN inference can be replaced with low-cost bit-shifts, massively reducing computation. This makes it much easier to deploy DNN-based speech recognition on embedded systems. Experiments show that the proposed method causes no performance degradation.
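To illustrate the idea, here is a minimal sketch in C. The helper quantize_pow2 and the 8-level shift budget are hypothetical illustrations, not taken from the paper; the authors' exact quantization of the sigmoid output may differ. A sigmoid activation a in (0, 1) is rounded to the nearest power of two 2^-k, so a fixed-point weight-activation product w * a reduces to an arithmetic shift w >> k.

#include <stdio.h>
#include <stdint.h>
#include <math.h>

/* Hypothetical helper: round a sigmoid output a in (0, 1) to the nearest
 * power of two 2^(-k) and return the exponent k, clamped to max_shift. */
static int quantize_pow2(double a, int max_shift)
{
    if (a <= 0.0)
        return max_shift;              /* clamp tiny/zero activations */
    int k = (int)lround(-log2(a));     /* nearest exponent in the log domain */
    if (k < 0)
        k = 0;
    if (k > max_shift)
        k = max_shift;
    return k;
}

int main(void)
{
    double  a = 0.31;                /* example sigmoid activation        */
    int32_t w = 4096;                /* fixed-point weight                */
    int     k = quantize_pow2(a, 7); /* a is approximated by 2^(-k)       */

    int32_t prod_shift = w >> k;     /* multiplication replaced by shift  */
    double  prod_exact = w * a;      /* floating-point reference product  */

    printf("k=%d  shifted=%d  exact=%.1f\n", k, prod_shift, prod_exact);
    return 0;
}

Because each quantized activation is represented by a single shift amount k, a whole matrix-vector product can be computed with only additions and shifts, which is the source of the speed-up on embedded hardware.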

Publication: IEICE TRANSACTIONS on Information, Vol.E99-D, No.10, pp.2558-2561
Publication Date: 2016/10/01
Publicized: 2016/07/19
Online ISSN: 1745-1361
DOI: 10.1587/transinf.2016SLL0007
Type of Manuscript: Special Section LETTER (Special Section on Recent Advances in Machine Learning for Spoken Language Processing)
Category: Acoustic modeling

Authors

Anhao XING
  Chinese Academy of Sciences
Qingwei ZHAO
  Chinese Academy of Sciences
Yonghong YAN
  Chinese Academy of Sciences, Xinjiang Laboratory of Minority Speech and Language Information Processing