
DORAS | DCU Research Repository


The 2022 ReproGen shared task on reproducibility of evaluations in NLG: overview and results

Belz, Anya (ORCID: 0000-0002-0552-8096), Shimorina, Anastasia, Popović, Maja (ORCID: 0000-0001-8234-8745) and Reiter, Ehud (ORCID: 0000-0002-7548-9504) (2022) The 2022 ReproGen shared task on reproducibility of evaluations in NLG: overview and results. In: 15th International Conference on Natural Language Generation: Generation Challenges, 17-22 July 2022, Waterville, ME, USA.

Abstract
Against a background of growing interest in reproducibility in NLP and ML, and as part of an ongoing research programme designed to develop theory and practice of reproducibility assessment in NLP, we organised the second shared task on reproducibility of evaluations in NLG, ReproGen 2022. This paper describes the shared task, summarises results from the reproduction studies submitted, and provides further comparative analysis of the results. Out of six initial team registrations, we received submissions from five teams. Meta-analysis of the five reproduction studies revealed varying degrees of reproducibility, and allowed further tentative conclusions about what types of evaluation tend to have better reproducibility.
Metadata
Item Type: Conference or Workshop Item (Paper)
Event Type: Conference
Refereed: Yes
Subjects: Computer Science > Computational linguistics; Computer Science > Machine learning
DCU Faculties and Centres: DCU Faculties and Schools > Faculty of Engineering and Computing > School of Computing; Research Institutes and Centres > ADAPT
Published in: Proceedings of the 15th International Conference on Natural Language Generation: Generation Challenges. Association for Computational Linguistics (ACL).
Publisher: Association for Computational Linguistics (ACL)
Official URL: https://aclanthology.org/2022.inlg-genchal.8
Copyright Information: © 2022 Association for Computational Linguistics
Funders: ReproHum project on Investigating Reproducibility of Human Evaluations in Natural Language Processing, funded by EPSRC (UK) under grant number EP/V05645X/1; ADAPT SFI Centre for Digital Media Technology, funded by Science Foundation Ireland under grant 13/RC/2106; European Regional Development Fund (ERDF)
ID Code: 28369
Deposited On: 25 May 2023 14:50 by Maja Popović. Last Modified: 04 Jul 2023 10:39
Documents

Full text available as:

2022.inlg-genchal.8.pdf (PDF, 224kB)
Licence: Creative Commons Attribution 4.0