A Deep Context-wise Method for Coreference Detection in Natural Language Requirements

2020 IEEE 28th International Requirements Engineering Conference (RE)(2020)

Cited by 12 | Views 325
Abstract
Requirements are usually written by different stakeholders with diverse backgrounds and skills, and they evolve continuously. Inconsistency caused by specialized jargon and differing domains is therefore inevitable. In particular, entity coreference in Requirements Engineering (RE) occurs when different linguistic expressions refer to the same real-world entity. It leads to misconceptions about technical terminology and negatively impacts the readability and understandability of requirements. Manual detection of entity coreference is labor-intensive and time-consuming. In this paper, we propose a DEEP context-wise semantic method named DeepCoref for entity COREFerence detection. It consists of a fine-tuned BERT model for context representation and a Word2Vec-based network for entity representation. A multi-layer perceptron at the end fuses and trades off the two representations to obtain a better representation of entities. The input of the network is the requirement's contextual text and the related entities, and the output is a predicted label indicating whether the two entities are coreferent. Evaluation on industry data shows that our approach significantly outperforms three baselines, with average precision and recall of 96.10% and 96.06%, respectively. We also compare DeepCoref with three variants to demonstrate the performance enhancement contributed by the different components.
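The abstract does not give architectural details, but the fusion step it describes (concatenating a BERT-derived context vector with Word2Vec entity vectors, then scoring the pair with a multi-layer perceptron) can be sketched as follows. All dimensions, weight shapes, and the use of NumPy with random placeholder parameters are assumptions for illustration, not the authors' implementation:

```python
import numpy as np

# Hypothetical dimensions: 768 for a BERT [CLS]-style context vector,
# 300 for each Word2Vec entity vector (neither is specified in the abstract).
CTX_DIM, ENT_DIM, HIDDEN = 768, 300, 128

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(x, 0.0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Randomly initialised MLP weights stand in for trained parameters.
W1 = rng.normal(scale=0.01, size=(CTX_DIM + 2 * ENT_DIM, HIDDEN))
b1 = np.zeros(HIDDEN)
W2 = rng.normal(scale=0.01, size=(HIDDEN, 1))
b2 = np.zeros(1)

def coref_score(ctx_vec, ent_vec_a, ent_vec_b):
    """Fuse the context and entity representations and score coreference.

    Returns a probability in [0, 1] that the two entities are coreferent.
    """
    fused = np.concatenate([ctx_vec, ent_vec_a, ent_vec_b])
    h = relu(fused @ W1 + b1)
    return float(sigmoid(h @ W2 + b2))

# Placeholder vectors stand in for real BERT / Word2Vec embeddings.
ctx = rng.normal(size=CTX_DIM)
ent_a = rng.normal(size=ENT_DIM)
ent_b = rng.normal(size=ENT_DIM)
p = coref_score(ctx, ent_a, ent_b)
```

In a trained model, `p` would be thresholded (e.g. at 0.5) to yield the binary coreference label the abstract mentions.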
Keywords
Requirement engineering,entity coreference,deep learning,fine-tuning BERT