A Hierarchy-to-Sequence Attentional Neural Machine Translation Model

Abstract

Although sequence-to-sequence attentional neural machine translation (NMT) has made great progress recently, it faces two challenges: learning optimal model parameters for long parallel sentences, and fully exploiting contexts of different scopes. In this paper, partially inspired by the idea of segmenting a long sentence into short clauses…
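The clause segmentation the abstract alludes to can be illustrated with a toy sketch. The excerpt does not give the paper's actual segmentation criteria, so the punctuation-based splitting rule, the greedy merging of short fragments, and the `max_len` threshold below are all assumptions for illustration only:

```python
import re

def segment_into_clauses(sentence, max_len=12):
    """Split a long sentence into short clauses (hypothetical rule:
    break on commas/semicolons, then greedily merge fragments so
    each clause stays at most max_len tokens)."""
    fragments = [f.strip() for f in re.split(r"[,;]", sentence) if f.strip()]
    clauses = []
    for frag in fragments:
        # Merge into the previous clause if the result stays short enough.
        if clauses and len(clauses[-1].split()) + len(frag.split()) <= max_len:
            clauses[-1] = clauses[-1] + " , " + frag
        else:
            clauses.append(frag)
    return clauses

sentence = ("the model reads each clause , encodes it with a clause-level RNN , "
            "and a sentence-level RNN then combines the clause representations")
print(segment_into_clauses(sentence))
```

Each resulting clause could then be encoded separately, with a higher-level encoder combining the clause representations, which is one plausible reading of the hierarchy-to-sequence idea in the title.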
DOI: 10.1109/TASLP.2018.2789721
