Retrieval Contrastive Learning for Aspect-Level Sentiment Classification

INFORMATION PROCESSING & MANAGEMENT (2024)

Abstract
Aspect-Level Sentiment Classification (ALSC) aims to assign a sentiment to each aspect mentioned in a sentence, and is one of the central challenges in Natural Language Processing (NLP). Although numerous approaches have been proposed and achieve strong results, most of them exploit the relationships between aspect and opinion words within a single instance while ignoring correlations with other instances; lacking this global viewpoint, models inevitably become trapped in local optima. A representation derived from a single instance is, on the one hand, insufficient, because it lacks descriptions of the aspect from other perspectives, and, on the other hand, redundant, because it cannot filter out extraneous content. To obtain a refined instance representation, we develop a Retrieval Contrastive Learning (RCL) framework that extracts intrinsic knowledge across instances. RCL consists of two modules: (a) obtaining retrieval instances with a sparse retriever and a dense retriever, and (b) extracting and learning the knowledge of the retrieved instances with Contrastive Learning (CL). To demonstrate the effectiveness of RCL, five ALSC models are employed in comprehensive experiments on three widely used benchmarks. Compared with the baselines, the ALSC models achieve substantial improvements when trained with RCL. In particular, ABSA-DeBERTa with RCL obtains new state-of-the-art results, outperforming advanced methods by 0.92%, 0.23%, and 0.47% Macro F1 on Laptops, Restaurants, and Twitter, respectively.
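To make module (b) concrete, below is a minimal sketch of an InfoNCE-style contrastive loss over retrieved instances. The abstract does not specify the exact loss, tensor shapes, or hyperparameters, so the formulation, the `temperature` value, and the positive/negative split by retrieved-instance label are assumptions for illustration only, not the paper's actual implementation.

```python
# Illustrative sketch (assumed formulation): contrastive learning over
# retrieved instances, pulling the anchor representation toward retrieved
# instances with the same aspect sentiment and pushing it away from others.
import torch
import torch.nn.functional as F


def retrieval_contrastive_loss(anchor, positives, negatives, temperature=0.07):
    """InfoNCE-style loss over retrieved instance representations.

    anchor:    (d,)    representation of the current instance
    positives: (P, d)  retrieved instances assumed to share the anchor's label
    negatives: (N, d)  retrieved instances assumed to have other labels
    """
    anchor = F.normalize(anchor, dim=-1)
    positives = F.normalize(positives, dim=-1)
    negatives = F.normalize(negatives, dim=-1)

    pos_sim = positives @ anchor / temperature   # (P,) cosine similarities
    neg_sim = negatives @ anchor / temperature   # (N,)

    # Denominator shared across positive pairs: log-sum-exp over all candidates.
    log_denom = torch.logsumexp(torch.cat([pos_sim, neg_sim]), dim=0)
    return -(pos_sim - log_denom).mean()


if __name__ == "__main__":
    d = 768  # e.g., a DeBERTa-base hidden size
    loss = retrieval_contrastive_loss(
        torch.randn(d), torch.randn(4, d), torch.randn(12, d)
    )
    print(loss.item())
```

In practice this auxiliary loss would be added to the standard ALSC classification objective, with the positives and negatives supplied by the sparse and dense retrievers described in module (a).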
Keywords
Natural language processing, Aspect-level sentiment classification, Information retrieval, Contrastive learning