Improving Sentence Representations With Local And Global Attention For Classification

2019 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN)(2019)

Abstract
Representation learning is a key issue for text classification tasks. Few existing representation models learn sufficient text information, i.e., both local semantic information and global structure information. This paper focuses on generating better semantic and structure representations and combining them into a better sentence representation. Specifically, we propose a hierarchical local and global attention network that learns sentence representations automatically. Local attention generates the semantic and structure representations separately, and global attention fuses them into the final representation, which is used for training and prediction. Experimental results show that our method achieves strong results on several text classification tasks, including sentiment analysis, subjectivity classification, and question type classification, with accuracies of 81.6% (MR), 93.6% (SUBJ), 49.4% (SST-5), and 95.6% (TREC).
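To make the hierarchical idea concrete, the following is a minimal, illustrative sketch of attention pooling applied at two levels: a "local" step that pools per-token vectors into a semantic representation and a structure representation, and a "global" step that fuses those two into the final sentence representation. The encoders, context vectors, and input values here are all hypothetical placeholders, not the paper's actual trained components.

```python
import math

def softmax(scores):
    """Numerically stable softmax over a list of floats."""
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def attention_pool(vectors, context):
    """Score each vector against a (normally learned) context vector,
    normalize the scores with softmax, and return the weighted sum."""
    scores = [sum(v_i * c_i for v_i, c_i in zip(v, context)) for v in vectors]
    weights = softmax(scores)
    dim = len(vectors[0])
    return [sum(w * v[d] for w, v in zip(weights, vectors)) for d in range(dim)]

# Hypothetical per-token vectors from a semantic encoder and a structure
# encoder (in the paper these would be produced by trained networks).
semantic_tokens = [[0.9, 0.1], [0.2, 0.8], [0.4, 0.4]]
structure_tokens = [[0.1, 0.5], [0.7, 0.3], [0.3, 0.9]]

# Local attention: pool each view's token vectors into one representation.
ctx_local = [1.0, 1.0]   # placeholder for a learned context vector
semantic_rep = attention_pool(semantic_tokens, ctx_local)
structure_rep = attention_pool(structure_tokens, ctx_local)

# Global attention: fuse the two view-level representations into the
# final sentence representation used for classification.
ctx_global = [0.5, 0.5]  # placeholder for a learned context vector
sentence_rep = attention_pool([semantic_rep, structure_rep], ctx_global)
print(sentence_rep)
```

Because each pooling step outputs a convex combination of its inputs, the final representation stays in the span of the token vectors; in the full model the context vectors are parameters trained jointly with the classifier.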
Key words
subjectivity classification, question type classification, sentence representation, global attention, representation learning, text classification tasks, local semantic information, global structure information, semantic structure representations, hierarchical local attention network