A Sparse Self-Attention Enhanced Model for Aspect-Level Sentiment Classification

Neural Processing Letters (2024)

Abstract
Aspect-based sentiment analysis (ABSA) has been refined into aspect-level sentiment classification (ASC), which has recently attracted considerable attention and produced strong results. This paper highlights the decisive role the targeted aspect plays, alongside the sentence content, in determining sentiment polarity. For instance, in "The food is tasty but the restaurant is untidy", the aspect food carries positive polarity, while the aspect restaurant carries negative polarity. Modeling the relation between an aspect and the context words in a sentence is therefore vital. Despite this progress, ASC still faces several pitfalls: (1) current attention-based methods can cause a given aspect to falsely attend to syntactically unrelated words as if they were related; (2) traditional Word2Vec or GloVe embedding vectors yield only a single context-independent representation per word; and (3) CNN-based models are insufficient for sentiments expressed by multiple non-consecutive words. To address these issues, a Sparse-Self-Attention-based Gated Recurrent Unit with Aspect Embedding (SSA-GRU-AE) built on BERT is proposed for ASC. The SSA-GRU-AE mechanism focuses on different portions of the sentence as multiple aspects are given as input. Experiments on standard ASC datasets show that the proposed model improves ASC performance.
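The abstract gives no implementation details, so the following minimal PyTorch sketch only illustrates one plausible reading of the described pipeline: token embeddings concatenated with an aspect embedding, a GRU encoder, sparse self-attention, and a polarity classifier. The top-k score masking as the sparse mechanism, the aspect vocabulary, all hyperparameters, and the use of a plain embedding table as a stand-in for BERT's contextual embeddings are assumptions, not the authors' specification.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SparseSelfAttentionGRU(nn.Module):
    """Hypothetical sketch of an SSA-GRU-AE-style model (assumptions noted above)."""

    def __init__(self, vocab_size, emb_dim=128, hidden_dim=128,
                 num_aspects=10, num_classes=3, top_k=5):
        super().__init__()
        # Stand-in for BERT: the paper uses BERT to obtain
        # context-dependent token representations.
        self.tok_emb = nn.Embedding(vocab_size, emb_dim)
        self.aspect_emb = nn.Embedding(num_aspects, emb_dim)
        # GRU consumes token and aspect embeddings concatenated per token.
        self.gru = nn.GRU(emb_dim * 2, hidden_dim, batch_first=True)
        self.query = nn.Linear(hidden_dim, hidden_dim)
        self.key = nn.Linear(hidden_dim, hidden_dim)
        self.value = nn.Linear(hidden_dim, hidden_dim)
        self.top_k = top_k
        self.classifier = nn.Linear(hidden_dim, num_classes)

    def forward(self, tokens, aspect):
        # tokens: (B, T) token ids; aspect: (B,) aspect ids
        x = self.tok_emb(tokens)                          # (B, T, E)
        a = self.aspect_emb(aspect).unsqueeze(1)          # (B, 1, E)
        a = a.expand(-1, x.size(1), -1)                   # repeat per token
        h, _ = self.gru(torch.cat([x, a], dim=-1))        # (B, T, H)

        # Scaled dot-product self-attention scores over GRU states.
        q, k, v = self.query(h), self.key(h), self.value(h)
        scores = q @ k.transpose(1, 2) / h.size(-1) ** 0.5  # (B, T, T)

        # Sparse mechanism (one plausible choice): keep only the top-k
        # scores per query position and mask the rest to -inf, so softmax
        # gives them zero weight and attention ignores unrelated words.
        k_eff = min(self.top_k, scores.size(-1))
        kth = scores.topk(k_eff, dim=-1).values[..., -1:]   # k-th largest
        scores = scores.masked_fill(scores < kth, float("-inf"))

        attn = F.softmax(scores, dim=-1)
        ctx = (attn @ v).mean(dim=1)                      # pool over tokens
        return self.classifier(ctx)                       # (B, num_classes)

# Toy usage: batch of 2 sentences, 8 tokens each, 3-way polarity output.
model = SparseSelfAttentionGRU(vocab_size=1000)
logits = model(torch.randint(0, 1000, (2, 8)), torch.tensor([0, 1]))
print(logits.shape)  # torch.Size([2, 3])
```

Feeding the same sentence with different aspect ids (as in the food/restaurant example above) changes the aspect embedding and hence which sentence portions the attention emphasizes, which is the behavior the abstract attributes to the model.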
Keywords
ASC, ABSA, Self-attention, Sparse mechanism, BERT