Hierarchical Back-off Modeling of Hiero Grammar based on Non-parametric Bayesian Model


In hierarchical phrase-based machine translation, a rule table is automatically learned by heuristically extracting synchronous rules from a parallel corpus. As a result, a spuriously large number of rules is extracted, many of which may be incorrect. A larger rule table increases decoding time and may lower translation quality. To resolve these problems, we propose a hierarchical back-off model for Hiero grammar, an instance of a synchronous context-free grammar (SCFG), based on the hierarchical Pitman-Yor process. The model can extract a compact rule and phrase table without resorting to any heuristics by hierarchically backing off to smaller phrases under the SCFG. Inference is efficiently carried out using the two-step synchronous parsing of Xiao et al. (2012) combined with slice sampling. In our experiments, the proposed model achieved higher or at least comparable translation quality against a previous Bayesian model on various language pairs: German-, French-, Spanish-, and Japanese-to-English. When compared against heuristic models, our model achieved comparable translation quality on a full-size German-to-English task on the Europarl v7 corpus with a significantly smaller grammar: less than 10% of the size of the heuristic model's.
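The hierarchical back-off described in the abstract can be illustrated with a minimal sketch of a Pitman-Yor back-off probability, where a multi-word phrase backs off to the product of its smaller sub-phrases, bottoming out at a uniform base distribution over words. This is a simplified illustration (a single shared restaurant with naive table seating; `PYLevel`, `phrase_prob`, and `VOCAB_SIZE` are hypothetical names), not the paper's actual model or inference procedure:

```python
VOCAB_SIZE = 1000  # assumed vocabulary size for the uniform base distribution

class PYLevel:
    """Chinese-restaurant representation of a Pitman-Yor process:
    per-phrase customer and table counts with discount d and strength theta."""
    def __init__(self, discount=0.5, strength=1.0):
        self.d = discount
        self.theta = strength
        self.counts = {}       # phrase -> number of customers
        self.tables = {}       # phrase -> number of tables
        self.total = 0         # total customers
        self.total_tables = 0  # total tables

    def prob(self, phrase, base_prob):
        """Predictive probability: discounted count mass plus
        back-off mass routed to the base distribution."""
        c = self.counts.get(phrase, 0)
        t = self.tables.get(phrase, 0)
        denom = self.theta + self.total
        return (max(c - self.d * t, 0.0) / denom
                + (self.theta + self.d * self.total_tables) / denom * base_prob)

    def add(self, phrase):
        """Simplified seating rule: open a new table only for unseen phrases."""
        if phrase not in self.counts:
            self.tables[phrase] = self.tables.get(phrase, 0) + 1
            self.total_tables += 1
        self.counts[phrase] = self.counts.get(phrase, 0) + 1
        self.total += 1

def phrase_prob(level, phrase):
    """Hierarchical back-off: a multi-word phrase (a tuple of words) backs
    off to the product of its two halves; a single word backs off to the
    uniform base distribution over the vocabulary."""
    if len(phrase) == 1:
        return level.prob(phrase, 1.0 / VOCAB_SIZE)
    mid = len(phrase) // 2
    base = phrase_prob(level, phrase[:mid]) * phrase_prob(level, phrase[mid:])
    return level.prob(phrase, base)
```

Observing a phrase raises its probability while unseen phrases retain mass through the back-off, which is what allows a compact table: rules whose probability is explained by their smaller parts need not be stored explicitly.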

7 Figures and Tables

Cite this paper

@inproceedings{Kamigaito2015HierarchicalBM,
  title     = {Hierarchical Back-off Modeling of Hiero Grammar based on Non-parametric Bayesian Model},
  author    = {Hidetaka Kamigaito and Taro Watanabe and Hiroya Takamura and Manabu Okumura and Eiichiro Sumita},
  booktitle = {EMNLP},
  year      = {2015}
}