Learning to Skim Text

@inproceedings{Yu2017LearningTS,
  title={Learning to Skim Text},
  author={Adams Wei Yu and Hongrae Lee and Quoc V. Le},
  booktitle={ACL},
  year={2017}
}
Recurrent Neural Networks are showing much promise in many sub-areas of natural language processing, ranging from document classification to machine translation to automatic question answering. [...] The underlying model is a recurrent network that learns how far to jump after reading a few words of the input text. We employ a standard policy gradient method to train the model to make discrete jumping decisions. In our benchmarks on four different tasks, including number prediction, sentiment…
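The jumping mechanism described above can be sketched at inference time: the model reads a small window of tokens with a recurrent cell, then a policy head chooses how many tokens to skip before reading again. The following is a minimal NumPy sketch with random, untrained weights; all names (`skim`, `W_jump`, `READ`, `MAX_JUMP`) and the greedy jump selection are illustrative assumptions, not the paper's implementation, which uses a trained LSTM and samples jumps during policy-gradient training.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions (hypothetical; the paper operates on word embeddings)
HIDDEN, VOCAB, MAX_JUMP, READ = 4, 10, 3, 2

# Random parameters standing in for trained weights
W_h = rng.normal(size=(HIDDEN, HIDDEN)) * 0.1
W_x = rng.normal(size=(HIDDEN, VOCAB)) * 0.1
W_jump = rng.normal(size=(MAX_JUMP + 1, HIDDEN)) * 0.1  # row 0 = "stop"

def rnn_step(h, token_id):
    """One recurrent update on a one-hot token (stand-in for an LSTM cell)."""
    x = np.zeros(VOCAB)
    x[token_id] = 1.0
    return np.tanh(W_h @ h + W_x @ x)

def skim(tokens):
    """Read READ tokens, then jump 0..MAX_JUMP tokens ahead (0 = stop early)."""
    h = np.zeros(HIDDEN)
    pos, visited = 0, []
    while pos < len(tokens):
        # Read a small window of tokens
        for i in range(pos, min(pos + READ, len(tokens))):
            h = rnn_step(h, tokens[i])
            visited.append(i)
        pos = min(pos + READ, len(tokens))
        # Policy head over discrete jump sizes; greedy here, whereas
        # training would sample and apply REINFORCE to the log-probability
        logits = W_jump @ h
        jump = int(np.argmax(logits))
        if jump == 0 or pos >= len(tokens):
            break
        pos += jump  # skip `jump` tokens without reading them
    return h, visited
```

Because the jump sizes are discrete and non-differentiable, the sampled decisions are trained with a score-function (REINFORCE-style) gradient, with the downstream task accuracy serving as the reward.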
Citations

Fast and Accurate Text Classification: Skimming, Rereading and Early Stopping
Leap-LSTM: Enhancing Long Short-Term Memory for Text Categorization
Length Adaptive Recurrent Model for Text Classification
Recurrent Chunking Mechanisms for Long-Text Machine Reading Comprehension
Learning to Search in Long Documents Using Document Structure
Neural Speed Reading with Structural-Jump-LSTM
Pointing to Select: A Fast Pointer-LSTM for Long Text Classification
Neural Speed Reading Audited
QANet: Combining Local Convolution with Global Self-Attention for Reading Comprehension