Publications
Exploiting Cross-Sentence Context for Neural Machine Translation
TLDR: We propose a cross-sentence context-aware approach and investigate the influence of historical contextual information on the performance of neural machine translation (NMT).
Exploiting Sentential Context for Neural Machine Translation
TLDR: In this work, we present novel approaches to exploit sentential context for neural machine translation (NMT).
A Systematic Comparison of Data Selection Criteria for SMT Domain Adaptation
TLDR: Data selection has shown significant improvements in the effective use of training data by extracting sentences from large general-domain corpora to adapt statistical machine translation (SMT) systems to in-domain data.
Self-Attention with Structural Position Representations
TLDR: We propose to augment SANs with structural position representations to model the latent structure of the input sentence, which is complementary to the standard sequential position representations.
Convolutional Self-Attention Networks
TLDR: In this work, we propose novel convolutional self-attention networks, which offer SANs the abilities to 1) strengthen dependencies among neighboring elements, and 2) model the interaction between features extracted by multiple attention heads.
The DCU Discourse Parser for Connective, Argument Identification and Explicit Sense Classification
TLDR: This paper describes our submission to the CoNLL-2015 shared task on discourse parsing.
Dynamic Layer Aggregation for Neural Machine Translation with Routing-by-Agreement
TLDR: We propose to use routing-by-agreement strategies to aggregate layers dynamically for deep NMT models.
A novel and robust approach for pro-drop language translation
TLDR: We propose a semi-supervised approach with a universal framework to recall missing pronouns in translation.
Translating Pro-Drop Languages with Reconstruction Models
TLDR: Pronouns are frequently omitted in pro-drop languages, such as Chinese, generally leading to significant challenges in producing complete translations.
Assessing the Ability of Self-Attention Networks to Learn Word Order
TLDR: Self-attention networks (SANs) have attracted considerable interest due to their high parallelization and strong performance on a variety of NLP tasks, e.g., machine translation.