Neural Speed Reading via Skim-RNN

@article{Seo2017NeuralSR,
  title={Neural Speed Reading via Skim-RNN},
  author={Min Joon Seo and Sewon Min and Ali Farhadi and Hannaneh Hajishirzi},
  journal={CoRR},
  year={2017},
  volume={abs/1711.02085}
}
Inspired by the principles of speed reading, we introduce Skim-RNN, a recurrent neural network (RNN) that dynamically decides to update only a small fraction of the hidden state for relatively unimportant input tokens. Skim-RNN gives a computational advantage over an RNN that always updates the entire hidden state. Skim-RNN uses the same input and output interfaces as a standard RNN and can be easily used in place of RNNs in existing models. In our experiments, we show that Skim-RNN achieves significantly reduced computational cost without losing accuracy compared to standard RNNs across several natural language tasks.
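
The core mechanism can be sketched in a few lines: at each time step a small classifier looks at the current input and the previous hidden state and decides whether to "read" (run a full-size RNN update) or "skim" (run a much smaller RNN that overwrites only the first few dimensions of the hidden state, leaving the rest untouched). Below is a minimal PyTorch sketch of this idea, under stated assumptions: the module name SkimRNNCell, the GRU sub-cells, the layer sizes, and the hard argmax decision are illustrative choices, not the authors' reference implementation (the paper trains the discrete read/skim decision with Gumbel-softmax so it remains differentiable).

import torch
import torch.nn as nn


class SkimRNNCell(nn.Module):
    """One step: choose between a full update (big cell) and a cheap
    'skim' update that rewrites only the first `small_size` dimensions
    of the hidden state and copies the rest forward unchanged."""

    def __init__(self, input_size: int, hidden_size: int, small_size: int):
        super().__init__()
        assert small_size < hidden_size
        self.hidden_size = hidden_size
        self.small_size = small_size
        self.big_cell = nn.GRUCell(input_size, hidden_size)
        self.small_cell = nn.GRUCell(input_size, small_size)
        # Tiny classifier that decides, per token, whether to read or skim.
        self.decision = nn.Linear(input_size + hidden_size, 2)

    def forward(self, x: torch.Tensor, h: torch.Tensor) -> torch.Tensor:
        # x: (batch, input_size), h: (batch, hidden_size)
        logits = self.decision(torch.cat([x, h], dim=-1))
        # Hard argmax for illustration; the paper uses Gumbel-softmax in training.
        skim = logits.argmax(dim=-1, keepdim=True).float()  # 1 = skim, 0 = read

        h_read = self.big_cell(x, h)                            # full update
        h_small = self.small_cell(x, h[:, : self.small_size])   # partial update
        h_skim = torch.cat([h_small, h[:, self.small_size:]], dim=-1)

        return skim * h_skim + (1.0 - skim) * h_read


if __name__ == "__main__":
    cell = SkimRNNCell(input_size=8, hidden_size=16, small_size=4)
    h = torch.zeros(2, 16)
    for x in torch.randn(5, 2, 8):   # 5 time steps, batch of 2
        h = cell(x, h)
    print(h.shape)  # torch.Size([2, 16])

Because the interface (one hidden state in, one hidden state out) matches a standard RNN cell, a module like this could be dropped into an existing model wherever an RNN cell is used, which is the substitution the abstract describes.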
