
Publication

Involving Language Professionals in the Evaluation of Machine Translation

Eleftherios Avramidis; Aljoscha Burchardt; Christian Federmann; Maja Popovic; Cindy Tscherwinka; David Vilar Torres
In: Proceedings of the Eighth International Conference on Language Resources and Evaluation (LREC-12), May 23-25, 2012, Istanbul, Turkey. European Language Resources Association (ELRA), 2012.

Abstract

Significant breakthroughs in machine translation only seem possible if human translators are brought into the loop. While automatic evaluation and scoring mechanisms such as BLEU have enabled the fast development of systems, it is not clear how systems can meet real-world (quality) requirements in industrial translation scenarios today. The taraXÜ project paves the way for the wide use of hybrid machine translation outputs through various feedback loops in system development. In a consortium of research and industry partners, the project integrates human translators into the development process: they rate and post-edit machine translation output, providing feedback that guides possible improvements.
