DORAS | DCU Research Repository
On reporting scores and agreement for error annotation tasks

Popović, Maja (ORCID: 0000-0001-8234-8745) and Belz, Anya (ORCID: 0000-0002-0552-8096) (2022) On reporting scores and agreement for error annotation tasks. In: 2nd Workshop on Natural Language Generation, Evaluation, and Metrics (GEM), 7 Dec 2022, Abu Dhabi, United Arab Emirates & Online.

Abstract
This work examines different ways of aggregating scores for error annotation in MT outputs: raw error counts, error counts normalised over the total number of words ('word percentage'), and error counts normalised over the total number of errors ('error percentage'). We use each of these three scores to calculate inter-annotator agreement in the form of Krippendorff's alpha and Pearson's r, and compare the resulting values, both overall and separately for different types of errors. While each score has its advantages depending on the goal of the evaluation, we argue that the best way of estimating inter-annotator agreement from such numbers is to use raw counts. If the annotation process ensures that the total number of words cannot differ between annotators (for example, by adding omission symbols), normalising over the number of words leads to the same conclusions. In contrast, the total number of errors is highly subjective because different annotators often perceive different numbers of errors in the same text, so normalising over this number can indicate lower agreement.
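The contrast the abstract draws can be made concrete with a small sketch. The code below is illustrative only (not the authors' code): it computes the three aggregations on hypothetical annotation data (`ann_a`, `ann_b`, and `total_words` are invented for the example) and applies a simplified interval-metric Krippendorff's alpha, restricted to exactly two annotators with no missing values.

```python
# Illustrative sketch, not the paper's implementation. Simplified
# Krippendorff's alpha (interval metric) for two annotators, no missing data.

def krippendorff_alpha_interval(a, b):
    """alpha = 1 - D_o / D_e over paired per-segment scores a and b."""
    n = len(a)
    pooled = a + b
    big_n = len(pooled)
    mean = sum(pooled) / big_n
    variance = sum((v - mean) ** 2 for v in pooled) / big_n
    d_o = sum((x - y) ** 2 for x, y in zip(a, b)) / n   # observed disagreement
    d_e = 2 * big_n * variance / (big_n - 1)            # expected disagreement
    return 1.0 if d_e == 0 else 1 - d_o / d_e

# Hypothetical per-segment error counts from two annotators.
ann_a = [3, 1, 0, 4, 2]
ann_b = [2, 1, 1, 5, 2]
total_words = 88  # hypothetical word total, identical for both annotators

def word_pct(counts):  # 'word percentage': normalise over total words
    return [c / total_words * 100 for c in counts]

def err_pct(counts):   # 'error percentage': normalise over that annotator's errors
    return [c / sum(counts) * 100 for c in counts]

print("alpha, raw counts:", round(krippendorff_alpha_interval(ann_a, ann_b), 3))
print("alpha, word %    :", round(krippendorff_alpha_interval(word_pct(ann_a), word_pct(ann_b)), 3))
print("alpha, error %   :", round(krippendorff_alpha_interval(err_pct(ann_a), err_pct(ann_b)), 3))
```

With equal word totals, the word-percentage scores rescale both annotators by the same constant, so alpha is unchanged relative to raw counts; the error-percentage scores divide each annotator by their own error total (10 vs. 11 here), so alpha on that aggregation can diverge, which is the effect the abstract discusses.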
Metadata
Item Type: Conference or Workshop Item (Paper)
Event Type: Workshop
Refereed: Yes
Subjects: Computer Science > Computational linguistics
DCU Faculties and Centres: Research Institutes and Centres > ADAPT
Published in: Proceedings of the 2nd Workshop on Natural Language Generation, Evaluation, and Metrics (GEM). Association for Computational Linguistics (ACL).
Publisher: Association for Computational Linguistics (ACL)
Official URL: https://aclanthology.org/2022.gem-1.26
Copyright Information: ©2022 Association for Computational Linguistics
Funders: Science Foundation Ireland through the SFI Research Centres Programme, co-funded under the European Regional Development Fund (ERDF) through Grant 13/RC/2106.
ID Code: 28364
Deposited On: 24 May 2023 16:02 by Maja Popović. Last Modified: 11 Jul 2023 14:55
Documents

Full text available as:

PDF (2022.gem-1.26.pdf) - Creative Commons: Attribution 4.0 - 208kB