Corpus ID: 237581301

One Source, Two Targets: Challenges and Rewards of Dual Decoding

@inproceedings{Xu2021OneST,
  title={One Source, Two Targets: Challenges and Rewards of Dual Decoding},
  author={Jitao Xu and François Yvon},
  booktitle={EMNLP},
  year={2021}
}
Machine translation is generally understood as generating one target text from an input source document. In this paper, we consider a stronger requirement: to jointly generate two texts so that each output side effectively depends on the other. As we discuss, such a device serves several practical purposes, from multi-target machine translation to the generation of controlled variations of the target text. We present an analysis of possible implementations of dual decoding, and experiment with…
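To make the idea concrete, here is a minimal, illustrative PyTorch sketch of synchronous dual decoding: one shared encoder and two decoders advance in lock-step, each conditioned at every step on the other's previously emitted token. All names and sizes (VOCAB, DIM, BOS, DualDecoder) are invented, and the mean-pooled source context and token-level coupling are deliberate simplifications of the cross-attention variants the paper analyzes; this is not the authors' implementation.

```python
# Toy sketch of synchronous dual decoding: two decoders that depend on each other.
import torch
import torch.nn as nn

VOCAB, DIM, BOS = 100, 32, 1  # hypothetical vocabulary size, hidden size, BOS id

class DualDecoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(VOCAB, DIM)
        self.encoder = nn.GRU(DIM, DIM, batch_first=True)
        # Each decoder step consumes [own prev token; partner's prev token; source context].
        self.dec_a = nn.GRUCell(3 * DIM, DIM)
        self.dec_b = nn.GRUCell(3 * DIM, DIM)
        self.out_a = nn.Linear(DIM, VOCAB)
        self.out_b = nn.Linear(DIM, VOCAB)

    @torch.no_grad()
    def greedy_decode(self, src_ids, max_len=10):
        enc_states, h = self.encoder(self.embed(src_ids))
        ctx = enc_states.mean(dim=1)          # crude source context (no attention)
        h_a = h_b = h.squeeze(0)
        tok_a = tok_b = torch.full((src_ids.size(0),), BOS, dtype=torch.long)
        out_a, out_b = [], []
        for _ in range(max_len):
            # Decoder A sees decoder B's previous token and vice versa:
            # this mutual conditioning is what makes the decoding "dual".
            inp_a = torch.cat([self.embed(tok_a), self.embed(tok_b), ctx], dim=-1)
            inp_b = torch.cat([self.embed(tok_b), self.embed(tok_a), ctx], dim=-1)
            h_a, h_b = self.dec_a(inp_a, h_a), self.dec_b(inp_b, h_b)
            tok_a = self.out_a(h_a).argmax(dim=-1)
            tok_b = self.out_b(h_b).argmax(dim=-1)
            out_a.append(tok_a)
            out_b.append(tok_b)
        return torch.stack(out_a, dim=1), torch.stack(out_b, dim=1)

model = DualDecoder()
src = torch.randint(2, VOCAB, (1, 7))         # a dummy source sentence
ys_a, ys_b = model.greedy_decode(src)
print(ys_a.shape, ys_b.shape)                 # two target sequences, decoded jointly
```

A real system would replace the mean-pooled context with source attention and the single-token coupling with attention over the partner decoder's full prefix; the sketch only shows the mutual dependence between the two output sides.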


References

Showing 1-10 of 55 references
Code-Switching for Enhancing NMT with Pre-Specified Translation
This work investigates a data augmentation method that builds code-switched training data by replacing source phrases with their target translations, allowing the model to learn lexical translations by copying source-side target words; a toy sketch of this substitution follows.
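The substitution step itself is simple. Below is a hypothetical illustration; the lexicon, function name, and replacement probability are all invented for this example and are not taken from the paper.

```python
# Toy code-switching augmentation: swap source words for target translations
# drawn from a small bilingual lexicon, so the model can learn to copy them.
import random

lexicon = {"house": "maison", "cat": "chat", "red": "rouge"}  # hypothetical lexicon

def code_switch(src_tokens, lexicon, p=0.3, seed=0):
    """Randomly substitute source tokens that have a known target translation."""
    rng = random.Random(seed)
    return [lexicon[t] if t in lexicon and rng.random() < p else t
            for t in src_tokens]

print(code_switch("the red cat sat in the house".split(), lexicon, p=0.9))
# e.g. ['the', 'rouge', 'chat', 'sat', 'in', 'the', 'maison']
```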
Synchronous Interactive Decoding for Multilingual Neural Machine Translation
This paper proposes a synchronous cross-interactive decoder in which the generation of each target output depends not only on its own already-generated sequence, but also on its future information and on the history and future contexts of the other target languages.
Meet Changes with Constancy: Learning Invariance in Multi-Source Translation
This paper proposes a source invariance network that learns the information shared by parallel sources, and demonstrates that the approach not only achieves clear gains in translation quality but also captures the implicit invariance between different sources.
Multi-Task Learning for Multiple Language Translation
The neural machine translation model is extended to a multi-task learning framework that shares the source-language representation while modeling each target language's translation with a separate decoder; a minimal sketch of this layout follows.
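As a rough illustration of that shared-encoder, per-language-decoder layout (all names and sizes here are hypothetical, not the paper's code):

```python
# Toy one-to-many NMT: one shared encoder, one decoder per target language.
import torch
import torch.nn as nn

VOCAB, DIM = 100, 32

class OneToManyNMT(nn.Module):
    def __init__(self, target_langs=("fr", "de")):
        super().__init__()
        self.embed = nn.Embedding(VOCAB, DIM)
        self.encoder = nn.GRU(DIM, DIM, batch_first=True)   # shared across tasks
        self.decoders = nn.ModuleDict({                      # one decoder per language
            lang: nn.GRU(DIM, DIM, batch_first=True) for lang in target_langs})
        self.outputs = nn.ModuleDict({
            lang: nn.Linear(DIM, VOCAB) for lang in target_langs})

    def forward(self, src_ids, tgt_ids, lang):
        _, h = self.encoder(self.embed(src_ids))             # shared source representation
        dec_out, _ = self.decoders[lang](self.embed(tgt_ids), h)
        return self.outputs[lang](dec_out)                   # logits for that language

model = OneToManyNMT()
src, tgt = torch.randint(2, VOCAB, (1, 7)), torch.randint(2, VOCAB, (1, 5))
print(model(src, tgt, "fr").shape)  # torch.Size([1, 5, 100])
```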
Synchronously Generating Two Languages with Interactive Decoding
A novel interactive approach that translates a source language into two different languages simultaneously and interactively, obtaining significant improvements over both a conventional neural machine translation (NMT) model and a multilingual NMT model.
Between Flexibility and Consistency: Joint Generation of Captions and Subtitles
The findings show that joint decoding leads to increased performance and consistency between the generated captions and subtitles, while still allowing sufficient flexibility to produce subtitles conforming to language-specific needs and norms.
Toward Multilingual Neural Machine Translation with Universal Encoder and Decoder
In this paper, we present our first attempts at building a multilingual Neural Machine Translation framework under a unified approach. We are then able to employ attention-based NMT for many-to-many…
Three Strategies to Improve One-to-Many Multilingual Translation
This work introduces three strategies to improve one-to-many multilingual translation by balancing shared and unique features, and proposes to divide the hidden cells of the decoder into shared and language-dependent ones.
Can You Traducir This? Machine Translation for Code-Switched Input
Experiments show that this training strategy yields MT systems that surpass multilingual systems on code-switched texts; these results are confirmed in an alternative task aimed at providing contextual translations for an L2 writing assistant.
Tied Multitask Learning for Neural Speech Translation
This work introduces a model in which the second task's decoder receives information from the decoder of the first task, since higher-level intermediate representations should provide useful information, and applies regularization that encourages transitivity and invertibility.