Pre-Training a Neural Model to Overcome Data Scarcity in Relation Extraction from Text.

BigComp (2019)

Abstract
Data scarcity is a major stumbling block in relation extraction. We propose an unsupervised pre-training method that extracts relational information from a large amount of unlabeled data prior to supervised learning, targeting situations where gold-labeled data are hard to obtain. During the pre-training phase we adopt an objective function that requires no labeled data: the model attempts to predict clue words crucial for inferring the semantic relation type between two entities in a given sentence. Experimental results on public datasets show that our approach achieves comparable performance using only 70% of the training data in a data-scarce setting.
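The abstract does not include code, so the following is only a minimal sketch of the kind of unsupervised clue-word prediction objective it describes: tokens between the two entity mentions are masked, and an encoder is trained to recover them from context, with no relation labels involved. The model architecture, toy vocabulary, masking rule, and all hyperparameters below are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn

VOCAB_SIZE = 1000   # assumed toy vocabulary size
MASK_ID = 0         # assumed id reserved for a [MASK] token
EMB_DIM = 64

class CluePredictor(nn.Module):
    """Hypothetical encoder that predicts a token id at every position."""
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(VOCAB_SIZE, EMB_DIM)
        self.encoder = nn.LSTM(EMB_DIM, EMB_DIM, batch_first=True,
                               bidirectional=True)
        self.out = nn.Linear(2 * EMB_DIM, VOCAB_SIZE)

    def forward(self, token_ids):
        h, _ = self.encoder(self.embed(token_ids))
        return self.out(h)  # (batch, seq_len, vocab) logits per position

def mask_between_entities(token_ids, e1_pos, e2_pos):
    """Mask every token strictly between the two entity positions.

    Returns the corrupted input and the original ids as targets;
    -100 marks positions that CrossEntropyLoss should ignore.
    """
    corrupted = token_ids.clone()
    targets = torch.full_like(token_ids, -100)
    lo, hi = sorted((e1_pos, e2_pos))
    corrupted[lo + 1:hi] = MASK_ID
    targets[lo + 1:hi] = token_ids[lo + 1:hi]
    return corrupted, targets

# One unsupervised pre-training step on a single toy sentence:
# no relation labels are used anywhere in this loop.
model = CluePredictor()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss(ignore_index=-100)

sentence = torch.randint(1, VOCAB_SIZE, (12,))  # fake token ids
inp, tgt = mask_between_entities(sentence, e1_pos=2, e2_pos=8)
logits = model(inp.unsqueeze(0))
loss = loss_fn(logits.view(-1, VOCAB_SIZE), tgt.unsqueeze(0).view(-1))
opt.zero_grad()
loss.backward()
opt.step()
print(f"pre-training loss: {loss.item():.4f}")
```

After pre-training with an objective of this shape, the encoder weights would be reused to initialize a supervised relation classifier, which is where the labeled data enter.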
Keywords
Task analysis,Training,Training data,Data models,Data mining,Semantics,Supervised learning