Corpus ID: 17352509

BLEU Evaluation of Machine-Translated English-Croatian Legislation

  • S. Seljan, Marija Brkic Bakaric, Tomislav Vicic
  • Published in LREC 2012
  • Computer Science
  • This paper presents work on the evaluation of a freely available online machine translation (MT) service, Google Translate, for the English–Croatian language pair in the domain of legislation. The total set of 200 sentences, for each of which three reference translations are provided, is divided into short and long sentences. Human evaluation is performed by native speakers using the criteria of adequacy and fluency. Fleiss' kappa is used to measure the reliability of agreement among the raters. Human…
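The abstract reports using Fleiss' kappa to measure agreement among the human raters. As an illustrative sketch only (not code or data from the paper), the statistic can be computed from a ratings matrix whose entry at row i, column j counts the raters who assigned sentence i to category j:

```python
def fleiss_kappa(ratings):
    """Fleiss' kappa for a ratings matrix.

    ratings[i][j] = number of raters who assigned subject i to category j.
    Assumes every subject is rated by the same number of raters.
    """
    N = len(ratings)            # number of subjects (e.g. sentences)
    n = sum(ratings[0])         # raters per subject
    total = N * n

    # Observed agreement: proportion of agreeing rater pairs per subject,
    # averaged over all subjects.
    p_bar = sum(
        (sum(c * c for c in row) - n) / (n * (n - 1))
        for row in ratings
    ) / N

    # Chance agreement from the marginal category proportions.
    k = len(ratings[0])
    p_e = sum(
        (sum(row[j] for row in ratings) / total) ** 2
        for j in range(k)
    )
    return (p_bar - p_e) / (1 - p_e)

# Hypothetical example: 2 sentences, 3 raters, 2 categories
# (say, adequate / not adequate). Numbers are invented for illustration.
print(round(fleiss_kappa([[2, 1], [1, 2]]), 4))  # → -0.3333
```

Kappa is 1.0 under perfect agreement, 0 when agreement matches chance, and negative (as in the toy example above) when raters agree less often than chance would predict.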
    11 Citations


    Automatic quality evaluation of machine-translated output in sociological-philosophical-spiritual domain. S. Seljan, I. Dunder. 2015 10th Iberian Conference on Information Systems and Technologies (CISTI), 2015. (cited 4 times)
    Empirical survey of machine translation tools. S. Chand. 2016 Second International Conference on Research in Computational Intelligence and Communication Networks (ICRCICN), 2016. (cited 16 times)
    A review of Thai–English machine translation. S. Lyons. Machine Translation, 2020.
    Language Modeling for Journalistic Robot based on Generative Pretrained Transformer 2.
    Machine Translation System for the Industry Domain and Croatian Language.
    Generating Question Titles for Stack Overflow from Mined Code Snippets. (cited 2 times)


    References

    Comparative Evaluation of Online Machine Translation Systems with Legal Texts. (cited 31 times)
    Performance of an online translation tool when applied to patient educational material. (cited 19 times)
    Taking on new challenges in multi-word unit processing for machine translation. (cited 14 times)
    Bleu: a Method for Automatic Evaluation of Machine Translation. (cited 13,005 times)
    Re-evaluating the Role of Bleu in Machine Translation Research. (cited 639 times; highly influential)
    Meteor, M-BLEU and M-TER: Evaluation Metrics for High-Correlation with Human Rankings of Machine Translation Output. (cited 104 times)
    Decomposability of Translation Metrics for Improved Evaluation and Efficient Algorithms. (cited 54 times)