EventGround: Narrative Reasoning by Grounding to Eventuality-centric Knowledge Graphs
International Conference on Computational Linguistics (2024)
Abstract
Narrative reasoning relies on the understanding of eventualities in story
contexts, which requires a wealth of background world knowledge. To help
machines leverage such knowledge, existing solutions can be categorized into
two groups. Some focus on implicitly modeling eventuality knowledge by
pretraining language models (LMs) with eventuality-aware objectives. However,
this approach breaks down knowledge structures and lacks interpretability.
Others explicitly collect world knowledge of eventualities into structured
eventuality-centric knowledge graphs (KGs). However, existing research on
leveraging these knowledge sources for free-text reasoning is limited. In this
work, we propose an initial comprehensive framework, EventGround, which aims to
tackle the problem of grounding free texts to eventuality-centric KGs for
contextualized narrative reasoning. We identify two critical problems in this
direction: the event representation and sparsity problems. We provide simple
yet effective parsing and partial information extraction methods to tackle
these problems. Experimental results demonstrate that our approach consistently
outperforms baseline models when combined with graph neural network (GNN) or
large language model (LLM) based graph reasoning models. Our framework,
incorporating grounded knowledge, achieves state-of-the-art performance while
providing interpretable evidence.