Emotion Detection with Transformers: A Comparative Study
CoRR (2024)
Abstract
In this study, we explore the application of transformer-based models to
emotion classification on text data. We train and evaluate several pre-trained
transformer variants on the Emotion dataset. The paper also analyzes factors
that influence model performance, such as fine-tuning of the transformer
layers, layer trainability, and text preprocessing. Our analysis reveals that
commonly applied techniques such as removing punctuation and stop words can
hinder model performance. This may be because the strength of transformers
lies in understanding contextual relationships within text: punctuation and
stop words can still convey sentiment or emphasis, and removing them can
disrupt this context.
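The layer-trainability factor mentioned above can be sketched as follows. This is a minimal PyTorch illustration, not the paper's actual setup: the toy encoder, its dimensions, and the single training step are placeholder assumptions standing in for a real pre-trained transformer. The idea is that freezing the encoder's parameters leaves only the classification head trainable.

```python
import torch
import torch.nn as nn

# Toy stand-in for a pre-trained transformer encoder (sizes are assumptions).
encoder = nn.TransformerEncoder(
    nn.TransformerEncoderLayer(d_model=32, nhead=4, batch_first=True),
    num_layers=2,
)
# Classification head for the 6 labels of the Emotion dataset.
head = nn.Linear(32, 6)

# Freeze the encoder: only the head's parameters remain trainable,
# mirroring the layer-trainability factor studied in the paper.
for p in encoder.parameters():
    p.requires_grad = False

trainable = [p for p in list(encoder.parameters()) + list(head.parameters())
             if p.requires_grad]
optimizer = torch.optim.AdamW(trainable, lr=1e-4)

# One dummy step: batch of 8 "sentences", 16 tokens, 32-dim embeddings.
x = torch.randn(8, 16, 32)
logits = head(encoder(x).mean(dim=1))  # mean-pool over tokens, then classify
loss = nn.functional.cross_entropy(logits, torch.randint(0, 6, (8,)))
loss.backward()
optimizer.step()
```

After `loss.backward()`, the frozen encoder parameters accumulate no gradients, so the optimizer updates only the head; unfreezing some or all encoder layers is the usual way to trade compute for accuracy when fine-tuning.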