Corpus ID: 218581522

Compressing Recurrent Neural Networks Using Hierarchical Tucker Tensor Decomposition

  title={Compressing Recurrent Neural Networks Using Hierarchical Tucker Tensor Decomposition},
  author={Miao Yin and Siyu Liao and Xiao-Yang Liu and X. Wang and Bo Yuan},
  • Published 2020
  • Computer Science, Mathematics
  • ArXiv
  • Recurrent Neural Networks (RNNs) have been widely used in sequence analysis and modeling. However, when processing high-dimensional data, RNNs typically require very large model sizes, bringing a series of deployment challenges. Although state-of-the-art tensor decomposition approaches can provide good model compression performance, these existing methods still suffer from inherent limitations, such as restricted representation capability and insufficient model complexity…
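The Hierarchical Tucker decomposition used in the paper is more involved, but the general idea the abstract describes can be illustrated with a minimal sketch: fold a dense weight matrix into a higher-order tensor, then replace it with a small core tensor plus one factor matrix per mode. The sketch below uses a plain truncated Tucker decomposition (via HOSVD) in NumPy rather than the hierarchical variant; the 64×64 weight shape, the 8×8×8×8 tensorization, and the ranks (4, 4, 4, 4) are arbitrary assumptions for illustration, not values from the paper.

```python
import numpy as np

def mode_product(tensor, matrix, mode):
    """n-mode product: multiply `tensor` by `matrix` along the given mode."""
    moved = np.moveaxis(tensor, mode, 0)
    shape = moved.shape
    out = matrix @ moved.reshape(shape[0], -1)
    return np.moveaxis(out.reshape((matrix.shape[0],) + shape[1:]), 0, mode)

def tucker_hosvd(tensor, ranks):
    """Truncated higher-order SVD: a basic Tucker decomposition.

    Returns a small core tensor and one factor matrix per mode.
    """
    factors = []
    for mode, rank in enumerate(ranks):
        # Mode-n unfolding, then keep the top `rank` left singular vectors.
        unfolding = np.moveaxis(tensor, mode, 0).reshape(tensor.shape[mode], -1)
        u, _, _ = np.linalg.svd(unfolding, full_matrices=False)
        factors.append(u[:, :rank])
    core = tensor
    for mode, factor in enumerate(factors):
        core = mode_product(core, factor.T, mode)
    return core, factors

def reconstruct(core, factors):
    """Rebuild the (approximate) full tensor from the core and factors."""
    out = core
    for mode, factor in enumerate(factors):
        out = mode_product(out, factor, mode)
    return out

# Demo: compress a hypothetical 64x64 RNN weight matrix.
rng = np.random.default_rng(0)
weight = rng.normal(size=(64, 64))
tensorized = weight.reshape(8, 8, 8, 8)  # fold the matrix into a 4-way tensor
core, factors = tucker_hosvd(tensorized, ranks=(4, 4, 4, 4))
compressed_params = core.size + sum(f.size for f in factors)
print(weight.size, compressed_params)  # 4096 dense vs. 384 compressed parameters
```

The compressed layer stores only the core and factors; during inference the weight (or the matrix-vector product) is reconstructed on the fly, trading a little compute for a large reduction in stored parameters.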
    3 Citations
    • Kronecker CP Decomposition with Fast Multiplication for Compressing RNNs
    • A Fully Tensorized Recurrent Neural Network
    • A Variational Information Bottleneck Based Method to Compress Sequential Networks for Human Action Recognition


    References
    • Learning Compact Recurrent Neural Networks with Block-Term Tensor Decomposition
    • Compressing Recurrent Neural Networks with Tensor Ring for Action Recognition
    • Long-term recurrent convolutional networks for visual recognition and description
    • Two Stream LSTM: A Deep Fusion Framework for Human Action Recognition
    • Video Paragraph Captioning Using Hierarchical Recurrent Neural Networks
    • Convolutional Two-Stream Network Fusion for Video Action Recognition