
DORAS | DCU Research Repository


Informative manual evaluation of machine translation output

Popović, Maja (ORCID: 0000-0001-8234-8745) (2020) Informative manual evaluation of machine translation output. In: 28th International Conference on Computational Linguistics (COLING 2020), 8-13 Dec 2020, Barcelona, Spain (Online).

Abstract
This work proposes a new method for manual evaluation of Machine Translation (MT) output based on marking actual issues in the translated text. The novelty is that the evaluators neither assign scores nor classify errors; instead, they mark all problematic parts (words, phrases, sentences) of the translation. The main advantage of this method is that the resulting annotations not only yield overall scores (by counting the words with assigned tags) but can also be used for further analysis of errors, challenging linguistic phenomena, and inter-annotator disagreements. Such detailed analysis and understanding of actual problems are not possible with typical manual evaluations, where annotators are asked to assign overall scores or to rank two or more translations. The proposed method is very general: it can be applied to any genre/domain and language pair, and it can be guided by various types of quality criteria. Moreover, it is not restricted to MT output but can be used for other types of generated text.
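To illustrate the scoring idea described in the abstract, here is a minimal sketch (not the paper's released tooling; the annotation format and function name are hypothetical) of how span-level annotations could be turned into an overall score: each annotator marks the indices of problematic words, and the score is the fraction of words left unmarked.

```python
def span_score(tokens, marked_indices):
    """Return the fraction of tokens NOT marked as problematic.

    tokens: list of words in the translation hypothesis.
    marked_indices: iterable of 0-based indices an annotator flagged
    as belonging to a problematic word, phrase, or sentence.
    """
    marked = set(marked_indices)  # deduplicate overlapping span marks
    return 1.0 - len(marked) / len(tokens)


# Example: annotator flags the duplicated word "on" (indices 3 and 4).
tokens = "the cat sat on on the mat".split()
score = span_score(tokens, [3, 4])  # 1 - 2/7
```

Because the raw marks are kept (not just the aggregate score), the same annotations can later be intersected across annotators to study disagreements, or grouped by linguistic phenomenon for error analysis.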
Metadata
Item Type: Conference or Workshop Item (Paper)
Event Type: Conference
Refereed: Yes
Subjects: Computer Science > Machine translating
DCU Faculties and Centres: DCU Faculties and Schools > Faculty of Engineering and Computing > School of Computing
Research Institutes and Centres > ADAPT
Published in: Proceedings of the 28th International Conference on Computational Linguistics. International Committee on Computational Linguistics.
Publisher: International Committee on Computational Linguistics
Official URL: https://doi.org/10.18653/v1/2020.coling-main.444
Copyright Information: © 2021 The Author.
Funders: European Association for Machine Translation (EAMT) under its programme "2019 Sponsorship of Activities" at the ADAPT Research Centre at Dublin City University, Science Foundation Ireland through the SFI Research Centres Programme Grant 13/RC/2106, European Regional Development Fund (ERDF)
ID Code: 28353
Deposited On: 23 May 2023 11:13 by Maja Popović. Last Modified: 31 May 2023 14:11
Documents

Full text available as:

PDF - Requires a PDF viewer such as GSview, Xpdf or Adobe Acrobat Reader
Creative Commons: Attribution 4.0
163kB