Constant-Time Machine Translation with Conditional Masked Language Models

@article{Ghazvininejad2019ConstantTimeMT,
  title={Constant-Time Machine Translation with Conditional Masked Language Models},
  author={Marjan Ghazvininejad and Omer Levy and Yinhan Liu and Luke S. Zettlemoyer},
  journal={CoRR},
  year={2019},
  volume={abs/1904.09324}
}
Most machine translation systems generate text autoregressively, predicting tokens sequentially from left to right. We instead use a masked language modeling objective to train a model to predict any subset of the target words, conditioned on both the input text and a partially masked target translation. This approach allows for efficient iterative decoding, where we first predict all of the target words non-autoregressively, and then repeatedly mask out and regenerate the subset of words that the model is least confident about.
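
A minimal sketch of this mask-predict decoding loop, in plain Python, is given below. The `model` object and its `predict(src, tgt)` method are hypothetical interfaces assumed for illustration, not the authors' released code, and the linear decay of the number of re-masked tokens is one simple choice of schedule consistent with the abstract's description.

MASK = "<mask>"

def mask_predict(model, src, tgt_len, iterations=10):
    """Sketch of iterative CMLM decoding. model.predict(src, tgt) is
    assumed to return (tokens, probs): the most likely token and its
    probability at every target position, conditioned on the source
    and the partially masked target."""
    # Iteration 0: predict every target token in parallel from a
    # fully masked target.
    tgt = [MASK] * tgt_len
    tokens, probs = model.predict(src, tgt)

    for t in range(1, iterations):
        # Re-mask fewer tokens each iteration (linear decay).
        n_mask = int(tgt_len * (iterations - t) / iterations)
        # Choose the positions the model is least confident about.
        worst = sorted(range(tgt_len), key=lambda i: probs[i])[:n_mask]
        tgt = [MASK if i in worst else tok for i, tok in enumerate(tokens)]
        # Regenerate only the masked positions, conditioned on the rest.
        new_tokens, new_probs = model.predict(src, tgt)
        for i in worst:
            tokens[i], probs[i] = new_tokens[i], new_probs[i]

    return tokens

Because the number of refinement passes is fixed in advance rather than growing with target length, decoding takes a constant number of model calls, which is where the "constant-time" in the title comes from.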

Citations

Publications citing this paper.

A Generalized Framework of Sequence Generation with Application to Undirected Sequence Models

Elman Mansimov, Alex Wang, Kyunghyun Cho
  • ArXiv
  • 2019

Defending Against Neural Fake News

Rowan Zellers, Ari Holtzman, +4 authors Yejin Choi
  • ArXiv
  • 2019

Real or Fake? Learning to Discriminate Machine from Human Generated Text

Anton Bakhtin, Sam Gross, +3 authors Arthur Szlam
  • ArXiv
  • 2019
