Multi-label emotion classification in texts using transfer learning

Expert Systems with Applications (2023)

Cited by 16 | Views 75
Abstract
Social media is a widespread platform that produces a massive amount of user-generated content, which can be mined to reveal the emotions of social media users. This has many potential benefits, such as gauging the public's pulse on various events or news. Emotion classification from social media posts is challenging, especially when multiple emotions must be detected in a short piece of text, as in a multi-label classification problem. Most previous work on emotion detection has focused on deep neural networks such as Convolutional Neural Networks (CNNs) and Recurrent Neural Networks (RNNs), in particular Long Short-Term Memory (LSTM) networks. However, none of it has combined Recurrent Neural Networks with multiple attention mechanisms (i.e., a specialized attention network for each emotion), nor utilized the recently introduced Transformer networks such as XLNet, DistilBERT, and RoBERTa for the task of classifying emotions with multiple labels. The proposed multiple attention mechanism reveals the contribution of each word to each emotion, which has not been investigated before. In this study, we investigate both LSTMs and the fine-tuning of Transformer networks through transfer learning, combined with either a single-attention network or a multiple-attention network, for multi-label emotion classification. The experimental results show that our novel transfer learning models using pre-trained Transformers, with and without multiple attention mechanisms, outperformed the current state-of-the-art accuracy (58.8%; Baziotis et al., 2018) on the SemEval-2018 Task-1C dataset. Our best-performing RoBERTa-MA (RoBERTa multi-attention) model surpassed the state of the art and achieved 62.4% accuracy (a 3.6% gain) on the challenging SemEval-2018 E-c: Detecting Emotions (multi-label classification) dataset for English. Moreover, the XLNet-MA (XLNet multi-attention) model outperformed the other proposed models on the Chinese Ren-CECps dataset, achieving 45.6% accuracy.
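To make the multiple-attention idea concrete, the sketch below shows one plausible way to place a separate attention query per emotion on top of a pre-trained RoBERTa encoder using the Hugging Face transformers library. It is an illustration under assumptions, not the paper's released implementation: the class name RobertaMultiAttention, the choice of roberta-base, and the hyperparameters are hypothetical, and the 11-label count follows the SemEval-2018 E-c emotion set.

```python
# Minimal sketch (assumption: not the authors' released code) of a RoBERTa
# encoder with one attention query per emotion ("multiple attention"), so each
# label gets its own word-level attention weights before a per-label logit.
import torch
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer


class RobertaMultiAttention(nn.Module):
    """Hypothetical RoBERTa-MA head; 11 labels follow SemEval-2018 E-c."""

    def __init__(self, model_name: str = "roberta-base", num_labels: int = 11):
        super().__init__()
        self.encoder = AutoModel.from_pretrained(model_name)
        hidden = self.encoder.config.hidden_size
        # One learnable attention query per emotion.
        self.label_queries = nn.Parameter(torch.randn(num_labels, hidden) * 0.02)
        # Maps each emotion's context vector to a single logit for that emotion.
        self.out = nn.Linear(hidden, 1)

    def forward(self, input_ids, attention_mask):
        tokens = self.encoder(input_ids=input_ids,
                              attention_mask=attention_mask).last_hidden_state  # (B, T, H)
        # Per-emotion attention over tokens: the softmax weights expose how much
        # each word contributes to each emotion.
        scores = torch.einsum("bth,lh->blt", tokens, self.label_queries)        # (B, L, T)
        scores = scores.masked_fill(attention_mask.unsqueeze(1) == 0, -1e9)
        weights = torch.softmax(scores, dim=-1)                                 # (B, L, T)
        context = torch.einsum("blt,bth->blh", weights, tokens)                 # (B, L, H)
        logits = self.out(context).squeeze(-1)                                  # (B, L)
        return logits, weights


# Usage: independent sigmoid per emotion, trained with a multi-label BCE loss.
tokenizer = AutoTokenizer.from_pretrained("roberta-base")
model = RobertaMultiAttention()
batch = tokenizer(["So relieved and happy it finally worked out!"],
                  return_tensors="pt", padding=True, truncation=True)
logits, attn = model(batch["input_ids"], batch["attention_mask"])
probs = torch.sigmoid(logits)            # one probability per emotion label
loss_fn = nn.BCEWithLogitsLoss()         # objective for multi-label training
```

Because each emotion attends to the token sequence independently, the attention weights can be inspected to see which words drive which predicted emotion, mirroring the word-level contributions described in the abstract.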
Keywords
Multi-label emotion classification, Bi-LSTM, Transformer Networks, Attention mechanism, Social media