Corpus ID: 11270374

Text Classification Improved by Integrating Bidirectional LSTM with Two-dimensional Max Pooling

@article{Zhou2016TextCI,
  title={Text Classification Improved by Integrating Bidirectional LSTM with Two-dimensional Max Pooling},
  author={Peng Zhou and Z. Qi and Suncong Zheng and Jiaming Xu and Hongyun Bao and Bo Xu},
  journal={ArXiv},
  year={2016},
  volume={abs/1611.06639}
}
  • Recurrent Neural Network (RNN) is one of the most popular architectures used in Natural Language Processing (NLP) tasks, because its recurrent structure is well suited to processing variable-length text. An RNN can utilize distributed representations of words by first converting the tokens comprising each text into vectors, which form a matrix. This matrix has two dimensions: the time-step dimension and the feature vector dimension. Most existing models then utilize one-dimensional…
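The abstract contrasts the usual one-dimensional pooling with the paper's two-dimensional max pooling over the (time-step, feature) matrix. A minimal NumPy sketch of non-overlapping 2D max pooling, with toy shapes and a pool size chosen purely for illustration (not values from the paper):

```python
import numpy as np

def max_pool_2d(matrix, pool=(2, 2)):
    """Non-overlapping 2D max pooling over a (time_step, feature) matrix."""
    t, d = matrix.shape
    pt, pd = pool
    # Truncate so each dimension is a multiple of the pool size,
    # then reshape into pooling windows and reduce with max.
    matrix = matrix[: t - t % pt, : d - d % pd]
    return matrix.reshape(t // pt, pt, d // pd, pd).max(axis=(1, 3))

# Toy stand-in for a BiLSTM output: 4 time steps, 6 hidden features.
h = np.arange(24, dtype=float).reshape(4, 6)
pooled = max_pool_2d(h, pool=(2, 2))
print(pooled.shape)  # (2, 3)
```

Unlike 1D max pooling over time alone, this reduces both the time-step and feature dimensions, which is the core idea the title refers to.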
    242 Citations
    Integrating Bidirectional LSTM with Inception for Text Classification
    • Wei Jiang, Z. Jin · 2017 4th IAPR Asian Conference on Pattern Recognition (ACPR), 2017 · 1 citation · Highly Influenced
    A novel method of text representation on hybrid neural networks
    • Yanbu Guo, C. Jin, W. Li, C. Ji, Yuan Fang, Yunhao Duan · 2017 10th International Congress on Image and Signal Processing, BioMedical Engineering and Informatics (CISP-BMEI), 2017
    Stacked Residual Recurrent Neural Networks With Cross-Layer Attention for Text Classification
    • 1 citation · Highly Influenced
    Text Classification Using Gated and Transposed Attention Networks
    • K. He, M. Zhu · 2019 International Joint Conference on Neural Networks (IJCNN), 2019 · 1 citation
    Study on Impact of RNN, CNN and HAN in Text Classification
    Length Adaptive Recurrent Model for Text Classification
    • 6 citations
    Leveraging Contextual Sentences for Text Classification by Using a Neural Attention Model
    • 2 citations
    Densely Connected Bidirectional LSTM with Applications to Sentence Classification
    • 27 citations

    References

    Showing 1-10 of 51 references
    Recurrent Convolutional Neural Networks for Text Classification
    • 1,131 citations · Highly Influential
    A C-LSTM Neural Network for Text Classification
    • 420 citations
    Relation Classification via Convolutional Deep Neural Network
    • 982 citations · Highly Influential
    Learning text representation using recurrent convolutional neural network with highway layers
    • 33 citations
    A Convolutional Neural Network for Modelling Sentences
    • 2,441 citations · Highly Influential
    Distributed Representations of Sentences and Documents
    • 5,416 citations
    Glove: Global Vectors for Word Representation
    • 15,446 citations
    Dependency Sensitive Convolutional Neural Networks for Modeling Sentences and Documents
    • 95 citations