Relation Extraction Based on BERT and BGRU in the Chinese Music Scene
International Conference on Knowledge-Based Intelligent Information & Engineering Systems(2023)
Abstract
Due to the lack of music knowledge extraction tools and public datasets, there are relatively few studies on music entity relation extraction. The Bidirectional Encoder Representations from Transformers (BERT) model takes contextual information into account and produces richer, more dynamic vector representations of words. Bidirectional Gated Recurrent Units (BGRU) have long-distance learning capability and can capture rich contextual semantic features. Therefore, this paper proposes a deep learning model for Chinese music relation extraction that combines BERT and BGRU with an attention mechanism. Experimental results show that the proposed method achieves good performance when extracting relations between musical entities.
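The pipeline the abstract outlines (contextual token embeddings, a bidirectional GRU over the sequence, and attention pooling into a sentence vector for relation classification) can be sketched as follows. This is a minimal, hypothetical illustration in plain NumPy, not the paper's implementation: the weight shapes, the random token vectors standing in for BERT embeddings, and all function names are assumptions made for clarity.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class GRU:
    """One-direction GRU layer with randomly initialized weights (illustrative only)."""
    def __init__(self, d_in, d_hid, rng):
        def w(a, b):
            return 0.1 * rng.standard_normal((a, b))
        self.Wz, self.Uz = w(d_in, d_hid), w(d_hid, d_hid)
        self.Wr, self.Ur = w(d_in, d_hid), w(d_hid, d_hid)
        self.Wh, self.Uh = w(d_in, d_hid), w(d_hid, d_hid)
        self.d_hid = d_hid

    def run(self, xs):
        # xs: (T, d_in) sequence of token vectors; returns all hidden states (T, d_hid)
        h, states = np.zeros(self.d_hid), []
        for x in xs:
            z = sigmoid(x @ self.Wz + h @ self.Uz)             # update gate
            r = sigmoid(x @ self.Wr + h @ self.Ur)             # reset gate
            h_cand = np.tanh(x @ self.Wh + (r * h) @ self.Uh)  # candidate state
            h = (1 - z) * h + z * h_cand
            states.append(h)
        return np.stack(states)

def bgru_attention(xs, fwd, bwd, w_att):
    """Concatenate forward/backward GRU states, then attention-pool into one vector."""
    H = np.concatenate([fwd.run(xs), bwd.run(xs[::-1])[::-1]], axis=1)  # (T, 2*d_hid)
    scores = np.tanh(H) @ w_att            # one attention score per token
    alpha = np.exp(scores - scores.max())
    alpha /= alpha.sum()                   # softmax over tokens
    return alpha @ H                       # attention-weighted sentence representation

rng = np.random.default_rng(0)
T, d_in, d_hid = 6, 16, 8
xs = rng.standard_normal((T, d_in))        # stand-in for BERT token embeddings
fwd, bwd = GRU(d_in, d_hid, rng), GRU(d_in, d_hid, rng)
w_att = rng.standard_normal(2 * d_hid)
sent_vec = bgru_attention(xs, fwd, bwd, w_att)
print(sent_vec.shape)  # (16,)
```

In the full model, `sent_vec` would be passed to a softmax layer over the set of music relation labels; here it simply demonstrates how the bidirectional states and attention weights combine.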
Keywords
Relation Extraction, Bidirectional Encoder Representations from Transformers (BERT), Gated Recurrent Units (GRU), Knowledge Graph, Deep Learning