Category-learning attention mechanism for short text filtering

Neurocomputing (2022)

Abstract
• A novel category-learning attention mechanism is proposed.
• It includes a category-learning scaled-dot-product attention mechanism.
• It includes a category-learning multi-head attention mechanism (CL-MHA).
• The attention mechanisms build category differentiation feature matrices dynamically.
• The CL-MHA-based bidirectional gated recurrent unit achieves the best performance.
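The highlights describe a pipeline in which a multi-head attention layer feeds a bidirectional GRU classifier. Below is a minimal, hedged sketch of that general architecture using standard scaled-dot-product multi-head attention; the paper's category-learning modification (CL-MHA) is not reproduced here, since the abstract gives no details of it, and all layer sizes and names are illustrative assumptions.

```python
# Illustrative sketch only: standard multi-head self-attention + BiGRU
# classifier for short texts. This is NOT the paper's CL-MHA; the
# category-learning mechanism is not specified in the abstract.
import torch
import torch.nn as nn


class AttnBiGRUClassifier(nn.Module):
    def __init__(self, vocab_size, embed_dim=128, num_heads=4,
                 hidden_dim=64, num_classes=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        # Standard scaled-dot-product multi-head attention (assumed stand-in).
        self.mha = nn.MultiheadAttention(embed_dim, num_heads, batch_first=True)
        self.bigru = nn.GRU(embed_dim, hidden_dim, batch_first=True,
                            bidirectional=True)
        self.fc = nn.Linear(2 * hidden_dim, num_classes)

    def forward(self, token_ids):
        x = self.embed(token_ids)            # (batch, seq_len, embed_dim)
        attn_out, _ = self.mha(x, x, x)      # self-attention over tokens
        _, h = self.bigru(attn_out)          # h: (2, batch, hidden_dim)
        h = torch.cat([h[0], h[1]], dim=-1)  # concat forward/backward states
        return self.fc(h)                    # class logits


if __name__ == "__main__":
    model = AttnBiGRUClassifier(vocab_size=10000)
    dummy = torch.randint(1, 10000, (8, 20))  # batch of 8 short messages
    print(model(dummy).shape)                 # torch.Size([8, 2])
```

In this sketch the attention layer re-weights token features before the BiGRU summarizes the sequence; the paper's contribution, per the highlights, is to make that attention build category differentiation feature matrices dynamically rather than using the fixed scaled-dot-product form shown here.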
Keywords
Short message classification, Multi-head attention, Bidirectional gated recurrent unit, Category-learning attention, Category-learning multi-head attention