Exploiting Cross-Sentence Context for Neural Machine Translation

@inproceedings{Wang2017ExploitingCC,
  title={Exploiting Cross-Sentence Context for Neural Machine Translation},
  author={Longyue Wang and Zhaopeng Tu and Andy Way and Qun Liu},
  booktitle={EMNLP},
  year={2017}
}
In translation, considering the document as a whole can help to resolve ambiguities and inconsistencies. In this paper, we propose a cross-sentence context-aware approach and investigate the influence of historical contextual information on the performance of neural machine translation (NMT). First, this history is summarized in a hierarchical way. We then integrate the historical representation into NMT via two strategies: 1) a warm start of the encoder and decoder states, and 2) an auxiliary…
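
The two-step design described above (a hierarchical summary of the preceding source sentences, then a warm start of the decoder state) can be sketched in a few lines. The following is a minimal PyTorch sketch under assumed GRU encoders; the class names, dimensions, and the tanh projection are illustrative assumptions, not the paper's released implementation.

import torch
import torch.nn as nn

class HierarchicalContextEncoder(nn.Module):
    """Summarize the K previous source sentences hierarchically:
    a sentence-level GRU turns each sentence into a vector, and a
    document-level GRU turns those K vectors into one history state.
    (Hypothetical sketch; names and sizes are assumptions.)"""
    def __init__(self, vocab_size, emb_dim=256, hid_dim=512):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim, padding_idx=0)
        self.sent_rnn = nn.GRU(emb_dim, hid_dim, batch_first=True)
        self.doc_rnn = nn.GRU(hid_dim, hid_dim, batch_first=True)

    def forward(self, history):
        # history: (batch, K, max_len) token ids of the K previous sentences
        b, k, t = history.shape
        words = self.embed(history.view(b * k, t))   # (b*k, t, emb_dim)
        _, sent_vec = self.sent_rnn(words)           # (1, b*k, hid_dim)
        sent_seq = sent_vec.view(b, k, -1)           # (b, K, hid_dim)
        _, doc_vec = self.doc_rnn(sent_seq)          # (1, b, hid_dim)
        return doc_vec.squeeze(0)                    # (b, hid_dim)

class WarmStartInit(nn.Module):
    """Strategy 1 (warm start): map the history summary to an initial
    decoder state through a learned projection with tanh squashing."""
    def __init__(self, hid_dim=512):
        super().__init__()
        self.proj = nn.Linear(hid_dim, hid_dim)

    def forward(self, history_summary):
        return torch.tanh(self.proj(history_summary))

# Toy usage: 4 examples, each with 3 previous sentences of 20 tokens.
encoder = HierarchicalContextEncoder(vocab_size=10000)
warm_start = WarmStartInit()
hist = torch.randint(1, 10000, (4, 3, 20))
s0 = warm_start(encoder(hist))   # (4, 512) initial decoder hidden state

The same history summary could likewise be fed as an auxiliary input at each decoder step, which corresponds to the second integration strategy the abstract mentions.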
