On Exploring Attention-based Explanation for Transformer Models in Text Classification

2021 IEEE International Conference on Big Data (Big Data)

Cited by 7 | Viewed 26
Abstract
Transformer models have achieved unprecedented breakthroughs in text classification and now underpin most state-of-the-art NLP systems. The core function driving this success is the attention mechanism, which dynamically focuses on different parts of the input sequence when producing predictions. Several previous works have investigated the usage ...
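The abstract describes attention as dynamically weighting parts of the input sequence. A minimal, purely illustrative sketch of scaled dot-product attention (not code from the paper) shows the per-token weight distribution that attention-based explanation methods typically inspect:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(Q, K, V):
    # Attention weights say how strongly each query position attends
    # to each key position; these weights are the raw signal that
    # attention-based explanation approaches examine.
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)
    weights = softmax(scores, axis=-1)
    return weights @ V, weights

# Toy example (hypothetical data): 3 token positions, 4-dim vectors.
rng = np.random.default_rng(0)
Q = rng.normal(size=(3, 4))
K = rng.normal(size=(3, 4))
V = rng.normal(size=(3, 4))
out, w = scaled_dot_product_attention(Q, K, V)
# Each row of w is a probability distribution over input tokens,
# so row sums are 1; ranking a row reveals which tokens the model
# "focused on" for that position.
```

This is a single-head sketch; in a real Transformer the same computation runs per head per layer, which is why aggregating or selecting among attention maps is itself a design question for explanation methods.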
Keywords
Degradation,Costs,Text categorization,Big Data,Transformer cores,Predictive models,Transformers