
DORAS | DCU Research Repository



Exploiting rich textual user-product context for improving personalized sentiment analysis

Lyu, Chenyang (ORCID: 0009-0002-6733-5879), Yang, Linyi (ORCID: 0000-0003-0667-7349), Zhang, Yue (ORCID: 0000-0002-5214-2268), Graham, Yvette (ORCID: 0000-0001-6741-4855) and Foster, Jennifer (ORCID: 0000-0002-7789-4853) (2023) Exploiting rich textual user-product context for improving personalized sentiment analysis. In: Findings of the Association for Computational Linguistics: ACL 2023, 9-14 July 2023, Toronto, Canada.

Abstract
User and product information associated with a review is useful for sentiment polarity prediction. Typical approaches incorporating such information focus on modeling users and products as implicitly learned representation vectors. Most do not exploit the potential of historical reviews, and those that do either require unnecessary modifications to the model architecture or do not make full use of user/product associations. The contribution of this work is twofold: i) a method to explicitly employ historical reviews belonging to the same user/product in initializing representations, and ii) efficient incorporation of textual associations between users and products via a user-product cross-context module. Experiments on the IMDb, Yelp-2013 and Yelp-2014 English benchmarks with BERT, SpanBERT and Longformer pretrained language models show that our approach substantially outperforms the previous state-of-the-art.
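The abstract's first contribution, initializing user/product representations from historical review text, can be illustrated with a minimal sketch. This is not the authors' code: it simply encodes a user's past reviews with a pretrained BERT encoder and mean-pools the [CLS] embeddings into a single vector; the model name, pooling strategy, and helper name are illustrative assumptions.

```python
# Minimal sketch (not the paper's implementation): build an initial user
# representation by encoding that user's historical reviews with BERT and
# averaging the per-review [CLS] embeddings.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
encoder = AutoModel.from_pretrained("bert-base-uncased")

def init_user_representation(historical_reviews: list[str]) -> torch.Tensor:
    """Encode a user's past reviews and pool them into one vector."""
    batch = tokenizer(
        historical_reviews,
        padding=True,
        truncation=True,
        max_length=256,
        return_tensors="pt",
    )
    with torch.no_grad():
        outputs = encoder(**batch)
    # One [CLS] embedding per review, then average over the user's reviews.
    cls_embeddings = outputs.last_hidden_state[:, 0, :]  # (n_reviews, hidden)
    return cls_embeddings.mean(dim=0)                    # (hidden,)

# The same routine could initialise a product representation from the
# reviews written about that product.
user_vec = init_user_representation([
    "Great pacing and a strong cast.",
    "The plot dragged but the ending was satisfying.",
])
print(user_vec.shape)  # torch.Size([768])
```

The paper's second contribution, a user-product cross-context module, would then combine such user- and product-side textual representations; the sketch above only covers the initialization step.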
Metadata
Item Type: Conference or Workshop Item (Paper)
Event Type: Conference
Refereed: Yes
Subjects: Computer Science > Artificial intelligence;
Computer Science > Computational linguistics;
Computer Science > Machine learning
DCU Faculties and Centres: DCU Faculties and Schools > Faculty of Engineering and Computing > School of Computing
Published in: Findings of the Association for Computational Linguistics: ACL 2023. Association for Computational Linguistics (ACL).
Publisher: Association for Computational Linguistics (ACL)
Official URL: https://doi.org/10.18653/v1/2023.findings-acl.92
Copyright Information: © 2023 Association for Computational Linguistics
Funders: Science Foundation Ireland through the SFI Centre for Research Training in Machine Learning (18/CRT/6183)
ID Code: 29140
Deposited On: 18 Oct 2023 13:02 by Jennifer Foster. Last Modified: 18 Oct 2023 13:02
Documents

Full text available as:

PDF (2023.findings-acl.92.pdf) - 821kB
License: Creative Commons Attribution 4.0