Recurrent Convolutional Neural Networks for Text Classification
@inproceedings{Lai2015RecurrentCN,
  title={Recurrent Convolutional Neural Networks for Text Classification},
  author={Siwei Lai and L. Xu and Kang Liu and Jun Zhao},
  booktitle={AAAI},
  year={2015}
}
Text classification is a foundational task in many NLP applications. [...] Key Method: In our model, we apply a recurrent structure to capture contextual information as far as possible when learning word representations, which may introduce considerably less noise than traditional window-based neural networks. We also employ a max-pooling layer that automatically judges which words play key roles in text classification, capturing the key components in texts. We conduct experiments on four commonly used…
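The recurrent-context plus max-pooling idea described in the abstract can be sketched as follows. This is a minimal NumPy illustration, not the authors' trained model: all dimensions, weight matrices, and the token sequence are hypothetical random placeholders chosen only to show the forward pass (left/right recurrent contexts concatenated with the word embedding, then max-pooling over time).

```python
import numpy as np

# Hypothetical sizes and random weights -- illustration only.
rng = np.random.default_rng(0)
vocab, d_emb, d_ctx, d_hidden, n_classes = 50, 8, 6, 10, 2
E = rng.normal(size=(vocab, d_emb))            # word embedding table
W_l = rng.normal(size=(d_ctx, d_ctx))          # left-context recurrence
W_sl = rng.normal(size=(d_ctx, d_emb))
W_r = rng.normal(size=(d_ctx, d_ctx))          # right-context recurrence
W_sr = rng.normal(size=(d_ctx, d_emb))
W2 = rng.normal(size=(d_hidden, d_emb + 2 * d_ctx))
W4 = rng.normal(size=(n_classes, d_hidden))

def rcnn_forward(token_ids):
    x = E[token_ids]                           # (T, d_emb)
    T = len(token_ids)
    # Left context: recurrent scan left-to-right over preceding words.
    cl = np.zeros((T, d_ctx))
    for t in range(1, T):
        cl[t] = np.tanh(W_l @ cl[t - 1] + W_sl @ x[t - 1])
    # Right context: recurrent scan right-to-left over following words.
    cr = np.zeros((T, d_ctx))
    for t in range(T - 2, -1, -1):
        cr[t] = np.tanh(W_r @ cr[t + 1] + W_sr @ x[t + 1])
    # Word representation = [left context; embedding; right context].
    rep = np.concatenate([cl, x, cr], axis=1)  # (T, d_emb + 2*d_ctx)
    y2 = np.tanh(rep @ W2.T)                   # per-word latent vectors
    y3 = y2.max(axis=0)                        # max-pooling over time
    logits = W4 @ y3
    p = np.exp(logits - logits.max())
    return p / p.sum()                         # class probabilities

probs = rcnn_forward([3, 17, 42, 5])
```

The element-wise max over time positions is what "automatically judges which words play key roles": each latent dimension keeps only its strongest activation across the sentence.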
1,214 Citations
Extensive Pyramid Networks for Text Classification
- Aust. J. Intell. Inf. Process. Syst., 2019

Recurrent Graph Neural Networks for Text Classification
- 2020 IEEE 11th International Conference on Software Engineering and Service Science (ICSESS) [Highly Influenced]

Convolutional neural network with contextualized word embedding for text classification
- Other Conferences, 2019 (1 citation)

Hierarchical Recurrent and Convolutional Neural Network Based on Attention for Chinese Document Classification
- 2019 Chinese Control And Decision Conference (CCDC) (1 citation)

Cluster-Gated Convolutional Neural Network for Short Text Classification
- CoNLL, 2019 [Highly Influenced]

Text Classification Using a Bidirectional Recurrent Neural Network with an Attention Mechanism
- 2020 International Conference on Culture-oriented Science & Technology (ICCST)

Leveraging Contextual Sentences for Text Classification by Using a Neural Attention Model
- Comput. Intell. Neurosci., 2019 (2 citations)

Text Classification Using Gated and Transposed Attention Networks
- 2019 International Joint Conference on Neural Networks (IJCNN) (1 citation)
References
Showing 1-10 of 36 references
Semi-Supervised Recursive Autoencoders for Predicting Sentiment Distributions
- EMNLP, 2011 (1,150 citations) [Highly Influential]

Explicit and Implicit Syntactic Features for Text Classification
- ACL, 2013 (51 citations) [Highly Influential]

Recursive Deep Models for Semantic Compositionality Over a Sentiment Treebank
- EMNLP, 2013 (4,172 citations) [Highly Influential]

Natural Language Processing (Almost) from Scratch
- J. Mach. Learn. Res., 2011 (5,849 citations) [Highly Influential]

Word Representations: A Simple and General Method for Semi-Supervised Learning
- ACL, 2010 (2,008 citations)

Recurrent Convolutional Neural Networks for Discourse Compositionality
- CVSM@ACL, 2013 (214 citations)

Dynamic Pooling and Unfolding Recursive Autoencoders for Paraphrase Detection
- NIPS, 2011 (820 citations) [Highly Influential]