Insertion-based Decoding with Automatically Inferred Generation Order

@article{Gu2019InsertionbasedDW,
  title={Insertion-based Decoding with Automatically Inferred Generation Order},
  author={Jiatao Gu and Qi Liu and Kyunghyun Cho},
  journal={Transactions of the Association for Computational Linguistics},
  year={2019},
  volume={7},
  pages={661--676}
}
Abstract: Conventional neural autoregressive decoding commonly assumes a fixed left-to-right generation order, which may be sub-optimal. In this work, we propose a novel decoding algorithm, InDIGO, which supports flexible sequence generation in arbitrary orders through insertion operations. We extend Transformer, a state-of-the-art sequence generation model, to efficiently implement the proposed approach, enabling it to be trained with either a pre-defined generation order or adaptive orders obtained from…
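The core idea in the abstract — building a sequence through insertion operations rather than strict left-to-right appends — can be illustrated with a toy sketch. This is not the InDIGO model itself (which learns where to insert via relative positions in a Transformer); it only shows, under a hypothetical hand-picked order, how a list of (slot, token) insertions can produce a sequence in a non-monotonic order:

```python
# Toy illustration of insertion-based decoding (not the actual InDIGO
# model): each step chooses a slot and a token, and the token is
# inserted at that slot, so many generation orders can yield the same
# final sequence. The insertion order below is a made-up example.

def insert_decode(steps):
    """Apply a list of (slot_index, token) insertion operations in order."""
    seq = []
    for slot, token in steps:
        seq.insert(slot, token)  # insert token before position `slot`
    return seq

# Produce "a b c d" in a non-left-to-right order: c, a, b, d.
steps = [(0, "c"), (0, "a"), (1, "b"), (3, "d")]
print(" ".join(insert_decode(steps)))  # -> a b c d
```

In the actual model, the (slot, token) choice at each step is predicted by a Transformer; the sketch above only demonstrates that the insertion operation itself decouples generation order from the final token order.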
49 Citations (selected):
  • Levenshtein Transformer
  • POINTER: Constrained Progressive Text Generation via Insertion-based Generative Pre-training
  • Cascaded Text Generation with Markov Transformers
  • POINTER: Constrained Text Generation via Insertion-based Generative Pre-training
  • Insertion Transformer: Flexible Sequence Generation via Insertion Operations
  • Sequence Modeling with Unconstrained Generation Order
  • Fast Interleaved Bidirectional Sequence Generation
  • A Generalized Framework of Sequence Generation with Application to Undirected Sequence Models
  • An Empirical Study of Generation Order for Machine Translation
