Few-shot relation classification based on the BERT model, hybrid attention and fusion networks

Applied Intelligence (2023)

Abstract
Relation classification (RC) is an essential task in information extraction. The distant supervision (DS) method can exploit large amounts of unlabeled data and alleviate the lack of training data for RC. However, DS suffers from long-tail and noise problems. Intuitively, these problems can be addressed with few-shot learning (FSL). Our work aims to improve both the accuracy and the speed of convergence on the few-shot RC task. We argue that entity pairs play an essential role in few-shot RC. We propose a new context encoder, built on the bidirectional encoder representations from transformers (BERT) model, that fuses entity pairs and their dependency information within instances. We also design a hybrid attention mechanism comprising support instance-level and query instance-level attention. Support instance-level attention dynamically assigns a weight to each instance in the support set, compensating for the limitation of prototypical networks, which weight all sentences equally. Query instance-level attention dynamically weights query instances by their similarity to the prototype. An ablation study demonstrates the effectiveness of the proposed method. In addition, a fusion network is designed to replace the Euclidean distance matching used in previous works, speeding up convergence and making the model more suitable for industrial applications. Experimental results show that the proposed model achieves higher accuracy than several competing models.
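The contrast the abstract draws between prototypical networks (which average support instances with equal weight) and support instance-level attention (which weights them dynamically) can be illustrated with a minimal sketch. The similarity function and the use of the query as the attention key are assumptions for illustration, not the paper's exact formulation:

```python
import numpy as np

def uniform_prototype(support):
    # Standard prototypical networks: every support instance
    # contributes equally to the class prototype.
    return support.mean(axis=0)

def attentive_prototype(support, query):
    # Hypothetical support instance-level attention: weight each
    # support embedding by its dot-product similarity to the query,
    # so instances unrelated to the query contribute less.
    scores = support @ query                  # one score per instance
    weights = np.exp(scores - scores.max())   # numerically stable softmax
    weights /= weights.sum()
    return weights @ support                  # attention-weighted prototype

support = np.array([[1.0, 0.0],   # two instances aligned with the query
                    [0.0, 1.0],   # one dissimilar (e.g., noisy) instance
                    [1.0, 0.1]])
query = np.array([1.0, 0.0])

p_uniform = uniform_prototype(support)
p_attentive = attentive_prototype(support, query)
```

Here the attention-weighted prototype is pulled toward the query-relevant instances, whereas the uniform prototype is dragged off by the dissimilar one; this is the insufficiency of equal weighting that the abstract describes.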
Key words
Relation classification, Few-shot learning, BERT, Attention, Rapidity of convergence