
DORAS | DCU Research Repository


Towards document-level human MT evaluation: on the issues of annotator agreement, effort and misevaluation

Castilho, Sheila ORCID: 0000-0002-8416-6555 (2021) Towards document-level human MT evaluation: on the issues of annotator agreement, effort and misevaluation. In: 16th Conference of the European Chapter of the Association for Computational Linguistics (EACL 2021), 19-23 April 2021, Online.

Document-level human evaluation of machine translation (MT) has been gaining interest in the community. However, little is known about the issues involved in using document-level methodologies to assess MT quality. In this article, we compare inter-annotator agreement (IAA) scores, the effort required to assess quality under different document-level methodologies, and the issue of misevaluation when sentences are evaluated out of context.
Item Type: Conference or Workshop Item (Speech)
Event Type: Conference
Uncontrolled Keywords: machine translation evaluation; document-level MT evaluation; human evaluation
Subjects: Computer Science > Machine translating
Humanities > Translating and interpreting
DCU Faculties and Centres: DCU Faculties and Schools > Faculty of Engineering and Computing > School of Computing
Research Institutes and Centres > ADAPT
Published in: Proceedings of the Workshop on Human Evaluation of NLP Systems (HumEval) 2021. Association for Computational Linguistics (ACL).
Publisher: Association for Computational Linguistics (ACL)
Official URL: https://aclanthology.org/2021.humeval-1.4
Copyright Information: © 2021 The Author (CC-BY-4.0)
Funders: Irish Research Council, European Association for Machine Translation (EAMT)
ID Code: 25710
Deposited On: 06 Apr 2021 11:34 by Dr Sheila Castilho M de Sousa. Last Modified: 16 Jan 2023 16:18

Full text available as:

PDF (EACL_2021_SheilaCastilho_doclevel eval.pdf)


