IAN-BERT: Combining Post-trained BERT with Interactive Attention Network for Aspect-Based Sentiment Analysis

SN Comput. Sci. (2023)

Abstract
Aspect-based sentiment analysis (ABSA), a subtask of sentiment analysis, predicts the sentiment polarity of specific aspects mentioned in an input sentence. Recent research has demonstrated the effectiveness of Bidirectional Encoder Representations from Transformers (BERT) and its variants in improving the performance of various Natural Language Processing (NLP) tasks, including sentiment analysis. However, BERT, pre-trained on Wikipedia and the BookCorpus, lacks domain-specific knowledge. Moreover, in the ABSA task, the attention mechanism leverages aspect information to determine the sentiment orientation of the aspect within the given sentence. Motivated by these observations, this paper proposes a novel approach called the IAN-BERT model. IAN-BERT applies an interactive attention mechanism on top of a BERT representation post-trained on Amazon and Yelp datasets. The objective is to capture domain-specific knowledge through the post-trained BERT representation and to identify the significance of context words with respect to aspect terms, and vice versa. By incorporating interactive attention, IAN-BERT aims to extract more relevant and informative features from the input text, ultimately leading to better predictions. Experimental evaluations conducted on the SemEval-14 Restaurant and Laptop datasets and the MAMS dataset demonstrate the effectiveness and superiority of the IAN-BERT model in aspect-based sentiment analysis.
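The interactive attention described above (context words attending to the aspect, and aspect words attending to the context) can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: the function names, the use of average pooling as the query, and plain dot-product scoring are assumptions for clarity; in practice the hidden states would come from the post-trained BERT encoder.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector.
    e = np.exp(x - x.max())
    return e / e.sum()

def interactive_attention(context, aspect):
    """Sketch of IAN-style interactive attention (hypothetical helper).

    context: (n, d) hidden states of the sentence words
    aspect:  (m, d) hidden states of the aspect words
    Returns a (2d,) feature vector for a sentiment classifier.
    """
    # Average-pool each side to form a query for the other side.
    ctx_pool = context.mean(axis=0)   # (d,)
    asp_pool = aspect.mean(axis=0)    # (d,)

    # Context words attend to the pooled aspect representation...
    ctx_scores = softmax(context @ asp_pool)   # (n,)
    ctx_repr = ctx_scores @ context            # (d,)

    # ...and aspect words attend to the pooled context representation.
    asp_scores = softmax(aspect @ ctx_pool)    # (m,)
    asp_repr = asp_scores @ aspect             # (d,)

    # The concatenated vector would feed a softmax polarity classifier.
    return np.concatenate([ctx_repr, asp_repr])
```

A classification head (a linear layer plus softmax over the polarity labels) would then map this 2d-dimensional vector to a sentiment prediction.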
Keywords
Aspect-based sentiment analysis, Attention mechanism, BERT, Post-trained BERT