
DORAS | DCU Research Repository

Explore open access research and scholarly works from DCU


Exploring variation of results from different experimental conditions

Popović, Maja (ORCID: 0000-0001-8234-8745), Arvan, Mohammad, Parde, Natalie (ORCID: 0000-0003-0072-7499) and Belz, Anya (ORCID: 0000-0002-0552-8096) (2023) Exploring variation of results from different experimental conditions. In: ACL 2023, 9-14 July 2023, Toronto, Canada.

Abstract
It might reasonably be expected that running multiple experiments for the same task using the same data and model would yield very similar results. Recent research has, however, shown this not to be the case for many NLP experiments. In this paper, we report extensive coordinated work by two NLP groups to run the training and testing pipeline for three neural text simplification (NTS) models under varying experimental conditions, including different random seeds, run-time environments, and dependency versions, yielding a large number of results for each of the three models using the same data and train/dev/test set splits. From one perspective, these results can be interpreted as shedding light on the reproducibility of evaluation results for the three NTS models, and we present an in-depth analysis of the variation observed for different combinations of experimental conditions. From another perspective, the results raise the question of whether the averaged score should be considered the ‘true’ result for each model.
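The kind of cross-run aggregation the abstract questions can be sketched in a few lines. This is a minimal illustration only: the scores below are invented placeholders, not results from the paper.

```python
# Hypothetical sketch: summarising evaluation scores from repeated runs of
# the same model and data under varying conditions (different random seeds,
# run-time environments, dependency versions). Scores are made-up values.
from statistics import mean, stdev

run_scores = [33.2, 34.1, 32.8, 33.9, 33.5]

avg = mean(run_scores)       # the candidate 'true' result the abstract questions
spread = stdev(run_scores)   # sample standard deviation: variation across runs
lo, hi = min(run_scores), max(run_scores)

print(f"mean={avg:.2f} stdev={spread:.2f} range=[{lo:.1f}, {hi:.1f}]")
```

Reporting the spread and range alongside the mean, rather than the mean alone, is one way to make the variation across experimental conditions visible.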
Metadata
Item Type: Conference or Workshop Item (Paper)
Event Type: Conference
Refereed: Yes
Subjects: Computer Science > Computational linguistics; Computer Science > Machine learning
DCU Faculties and Centres: DCU Faculties and Schools > Faculty of Engineering and Computing > School of Computing; Research Institutes and Centres > ADAPT
Published in: Rogers, Anna, Okazaki, Naoaki and Boyd-Graber, Jordan (eds.) Findings of the Association for Computational Linguistics: ACL 2023. Association for Computational Linguistics (ACL).
Publisher: Association for Computational Linguistics (ACL)
Official URL: https://aclanthology.org/2023.findings-acl.172
Copyright Information: © 2023 Association for Computational Linguistics.
Funders: Science Foundation Ireland under Grant Agreement No. 13/RC/2106_P2 at the ADAPT SFI Research Centre at Dublin City University
ID Code: 28689
Deposited On: 12 Jul 2023 10:27 by Maja Popović. Last Modified: 08 Mar 2024 12:21
Documents

Full text available as:

PDF (ACL_2023_Reproducing_NTS.pdf) - Requires a PDF viewer such as GSview, Xpdf or Adobe Acrobat Reader
Creative Commons: Attribution-Noncommercial-No Derivative Works 4.0
177kB
