Corpus ID: 18233504

Generating Text with Deep Reinforcement Learning

@article{Guo2015GeneratingTW,
  title={Generating Text with Deep Reinforcement Learning},
  author={Hongyu Guo},
  journal={ArXiv},
  year={2015},
  volume={abs/1510.09202}
}
  • Hongyu Guo
  • Published 2015
  • Computer Science
  • ArXiv
    • We introduce a novel schema for sequence-to-sequence learning with a Deep Q-Network (DQN), which decodes the output sequence iteratively. [...] This list can contain ranked potential words. Next, the DQN learns to decide which action (e.g., word) will be selected from the list to modify the current decoded sequence. The newly modified output sequence is subsequently used as the input to the DQN for the next decoding iteration. In each iteration, we also bias the reinforcement learning's…
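
    The abstract describes an iterative decode-and-revise procedure: an encoder-decoder proposes a ranked list of candidate words, a DQN selects which word (action) to place into the current decoded sequence, and the modified sequence is fed back as the input for the next decoding iteration. Below is a minimal, hypothetical Python sketch of that loop; the stub functions ranked_candidates and q_values, the vocabulary, and all constants are illustrative placeholders standing in for the paper's trained encoder-decoder and Q-network, not the authors' implementation.

        import random

        random.seed(0)

        VOCAB = ["the", "cat", "sat", "on", "mat", "a", "dog"]
        SEQ_LEN = 5          # length of the decoded output sequence
        NUM_CANDIDATES = 3   # size of the ranked candidate list per position
        NUM_ITERATIONS = 4   # number of decode-and-revise passes

        def ranked_candidates(decoded, position):
            """Stand-in for the encoder-decoder: return a ranked list of
            candidate words for one position of the current decoded sequence."""
            return random.sample(VOCAB, NUM_CANDIDATES)

        def q_values(decoded, position, candidates):
            """Stand-in for the DQN: score each candidate action (word) as a
            replacement for the word at `position`, given the whole sequence."""
            return [random.random() for _ in candidates]

        # Start from a placeholder decoded sequence and revise it iteratively.
        decoded = ["<unk>"] * SEQ_LEN

        for it in range(NUM_ITERATIONS):
            for pos in range(SEQ_LEN):
                candidates = ranked_candidates(decoded, pos)
                scores = q_values(decoded, pos, candidates)
                # Greedy action selection: take the word with the highest Q-value.
                best = max(range(NUM_CANDIDATES), key=lambda i: scores[i])
                decoded[pos] = candidates[best]
            # The newly modified sequence becomes the DQN's input next iteration.
            print(f"iteration {it + 1}: {' '.join(decoded)}")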

    Citations

    Publications citing this paper.
    SHOWING 1-10 OF 25 CITATIONS, ESTIMATED 87% COVERAGE

    Sentence-Level Semantic Features Guided Adversarial Network for Zhuang Language Part-of-Speech Tagging

    Relation Extraction with Deep Reinforcement Learning

    A Generative Model for category text generation

    Learning How to Self-Learn: Enhancing Self-Training Using Neural Reinforcement Learning

    • Chenhua Chen, Yue Zhang
    • Computer Science
    • 2018 International Conference on Asian Language Processing (IALP)
    • 2018

    CITATION STATISTICS

    • 2 Highly Influenced Citations

    • Averaged 7 Citations per year from 2018 through 2020

    References

    Publications referenced by this paper.
    SHOWING 1-10 OF 61 REFERENCES

    Show and tell: A neural image caption generator

    Speech recognition with deep recurrent neural networks

    Weakly Supervised Learning of Semantic Parsers for Mapping Instructions to Actions