Corpus ID: 15890012

Translation Quality and Effort: Options versus Post-editing

Donald Sturgeon and John Sie Yuen Lee
Past research has shown that various types of computer assistance can reduce translation effort and improve translation quality over manual translation. This paper directly compares two common types of assistance – selection from lists of translation options, and post-editing of machine translation (MT) output produced by Google Translate – across two significantly different subject domains for Chinese-to-English translation. In terms of translation effort, we found that the use of options can…


Preference learning for machine translation
Algorithms are developed that can learn from very large amounts of data by exploiting pairwise preferences defined over competing translations; these not only make a machine translation system robust to arbitrary texts from varied sources, but also enable it to adapt effectively to new domains of data.
The efficacy of human post-editing for language translation
It is found that post-editing leads to reduced time and, surprisingly, improved quality for three diverse language pairs (English to Arabic, French, and German).
A Process Study of Computer Aided Translation
The computer aided tool Caitra is developed that makes suggestions for sentence completion, shows word and phrase translation options, and allows post-editing of machine translation output.
Comparison of post-editing productivity between professional translators and lay users
Results suggest that overall, post-editing increases translation throughput for both translators and users, although the latter seem to benefit more from the MT output.
Towards predicting post-editing productivity
Correlations between General Text Matcher (GTM) and Translation Edit Rate (TER) scores and post-editing productivity are investigated, along with the relationship between segments with high GTM and TER scores and cognitive measures of effort.
Post-editing time as a measure of cognitive effort
This paper presents two experiments investigating the connection between post-editing time and cognitive effort, and examines whether sentences with long and short post-editing times involve edits of different levels of difficulty.
Comparing Forum Data Post-Editing Performance Using Translation Memory and Machine Translation Output: A Pilot Study
A pilot study undertaken with translation students compared post-editing performance on community forum content using suggestions from different translation systems; the results showed that post-edited MT output obtained higher scores on each of the variables measured.
This translation is not too bad: an analysis of post-editor choices in a machine-translation post-editing task
A post-editing task involving controlled-language tourist phrases translated from English into Finnish is described, in which post-editors select the best of three machine-translated suggestions, which they can accept without editing or post-edit as necessary.
Productivity and quality in MT post-editing
Results suggest that translators have higher productivity and quality when using machine-translated output than when processing fuzzy matches from translation memories, and that technical experience seems to have an impact on productivity but not on quality.
Assessing post-editing efficiency in a realistic translation environment
It is found that post-editing reduces translation time significantly, although considerably less than reported in isolated experiments, and it is argued that overall assessments of post-editing efficiency should be based on a realistic translation environment.
Repairing Texts: Empirical Investigations of Machine Translation Post-Editing Processes
In Repairing Texts, Hans P. Krings challenges the idea that, given the effectiveness of machine translation, major costs could be reduced by using monolingual staff to post-edit translations.