Publications
Exploiting Cross-Sentence Context for Neural Machine Translation
TLDR
We propose a cross-sentence context-aware approach and investigate the influence of historical contextual information on the performance of neural machine translation (NMT).
Long-Distance Dependency Resolution in Automatically Acquired Wide-Coverage PCFG-Based LFG Approximations
TLDR
This paper shows how finite approximations of long-distance dependency (LDD) resolution can be obtained automatically for wide-coverage, robust, probabilistic Lexical-Functional Grammar (LFG) resources acquired from treebanks.
Attaining the Unattainable? Reassessing Claims of Human Parity in Neural Machine Translation
TLDR
We reassess a recent study (Hassan et al., 2018) that claimed that machine translation (MT) has reached human parity for the translation of news from Chinese into English, using pairwise ranking and considering three variables that were not taken into account in that previous study: the language in which the source side of the test set was originally written, the translation proficiency of the evaluators, and the provision of inter-sentential context.
Robust language pair-independent sub-tree alignment
TLDR
We propose a novel, language pair-independent algorithm which automatically induces alignments between phrase-structure trees.
Getting Gender Right in Neural Machine Translation
TLDR
We integrate gender information into an NMT system and show that it improves the translation quality for some language pairs.
Parsing with PCFGs and automatic f-structure annotation
TLDR
In this paper we report initial results on a new methodology that attempts to partially automate the development of substantial parts of large-coverage, rich unification- (constraint-) based grammar resources.
Investigating Backtranslation in Neural Machine Translation
TLDR
A prerequisite for training corpus-based machine translation (MT) systems -- either Statistical MT (SMT) or Neural MT (NMT) -- is the availability of high-quality parallel data.
Multi-Level Structured Self-Attentions for Distantly Supervised Relation Extraction
TLDR
We propose a novel multi-level structured (2-D matrix) self-attention mechanism for DS-RE in a multi-instance learning (MIL) framework using bidirectional recurrent neural networks.
Recent Advances in Example-Based Machine Translation
Contents (partial): Part I, Foundations of EBMT. 1. An Overview of EBMT. 2. What is Example-Based Machine Translation? 3. Example-Based Machine Translation in a Controlled Environment. 4. EBMT Seen as Case-based Reasoning.
Exploiting source similarity for SMT using context-informed features
TLDR
In this paper, we introduce context-informed features in a log-linear phrase-based SMT framework; these features enable us to exploit source similarity in addition to the target similarity modelled by the language model.