Effective Quantization Approaches for Recurrent Neural Networks.

Authors: Md Zahangir Alom, Adam T. Moody, Naoya Maruyama, Brian C. Van Essen, Tarek M. Taha

Deep learning methods, and in particular Recurrent Neural Networks (RNNs), have shown superior accuracy in a large variety of tasks including machine translation, language understanding, and movie frame generation. However, these deep learning approaches are very expensive in terms of computation. In most cases, Graphics Processing Units (GPUs) are used for large scale implementations. Meanwhile, energy efficient RNN approaches have been proposed for deploying solutions on special purpose hardware, including Field Programmable Gate Arrays (FPGAs) and mobile platforms. In this paper, we propose an effective quantization approach for Recurrent Neural Network (RNN) techniques including Long Short Term Memory (LSTM), Gated Recurrent Units (GRU), and Convolutional Long Short Term Memory (ConvLSTM). We have implemented different quantization methods, including BinaryConnect {-1, 1}, Ternary Connect {-1, 0, 1}, and Quaternary Connect {-1, -0.5, 0.5, 1}. The proposed approaches are evaluated on two tasks: sentiment analysis on the IMDB dataset and video frame prediction on the moving MNIST dataset. The experimental results are compared against the full precision versions of the LSTM, GRU, and ConvLSTM, and show promising results for both sentiment analysis and video frame prediction.
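To make the three weight mappings concrete, below is a minimal NumPy sketch of deterministic variants of the quantizers named in the abstract. The ternary threshold value and the nearest-level rounding rule are illustrative assumptions, not details taken from the paper, which may instead use stochastic quantization as in the original BinaryConnect work.

    import numpy as np

    def binarize(w):
        # BinaryConnect: map each weight to {-1, 1} by its sign.
        return np.where(w >= 0, 1.0, -1.0)

    def ternarize(w, threshold=0.5):
        # Ternary Connect: weights near zero snap to 0, the rest to +/-1.
        # The 0.5 threshold is an illustrative choice, not the paper's.
        q = np.zeros_like(w)
        q[w > threshold] = 1.0
        q[w < -threshold] = -1.0
        return q

    def quaternize(w):
        # Quaternary Connect: round each weight to the nearest level in
        # {-1, -0.5, 0.5, 1} (level set taken from the abstract).
        levels = np.array([-1.0, -0.5, 0.5, 1.0])
        idx = np.argmin(np.abs(w[..., None] - levels), axis=-1)
        return levels[idx]

    # Example: quantize a clipped weight matrix, as would happen
    # before a forward pass through a recurrent layer.
    w = np.clip(np.random.randn(3, 3), -1.0, 1.0)
    print(binarize(w))
    print(ternarize(w))
    print(quaternize(w))

In training schemes of this family, the quantized weights are typically used only for forward and backward propagation, while a full-precision copy of the weights accumulates the gradient updates.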
