Logic Query of Thoughts: Guiding Large Language Models to Answer Complex Logic Queries with Knowledge Graphs
arXiv (2024)
Abstract
Despite their superb performance on many tasks, large language models (LLMs)
risk generating hallucinations or even wrong answers when confronted with
tasks that demand accurate knowledge. The issue becomes even more noticeable
when addressing logic queries that require multiple reasoning steps. On the
other hand, knowledge graph (KG) based question answering methods can
accurately identify correct answers with the help of the knowledge graph, yet
their accuracy quickly deteriorates when the knowledge graph itself is sparse
and incomplete. How to integrate knowledge graph reasoning with LLMs in a
mutually beneficial way, so as to mitigate both the hallucination problem of
LLMs and the incompleteness issue of knowledge graphs, remains a critical
challenge. In this paper, we propose 'Logic-Query-of-Thoughts' (LGOT), which
is the first of its kind to combine LLMs with knowledge graph based logic
query reasoning. LGOT seamlessly combines knowledge graph reasoning and LLMs,
effectively breaking complex logic queries down into easy-to-answer
subquestions. Using both knowledge graph reasoning and LLMs, it derives
answers for each subquestion. By aggregating these results and selecting the
highest-quality candidate answers at each step, LGOT achieves accurate results
on complex questions. Our experimental findings demonstrate substantial
performance enhancements, with up to 20
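The pipeline described above (decompose a complex logic query into subquestions, answer each with both the KG and the LLM, then aggregate and keep the best candidates per step) can be sketched as follows. This is a minimal illustration only: the function names, the confidence scores, and the toy knowledge graph are hypothetical stand-ins, not the authors' actual implementation.

```python
# Hypothetical sketch of an LGOT-style step: answer a subquestion with both a
# knowledge graph and an LLM, merge the candidate sets, and keep the top ones.

def kg_answers(subquestion, kg):
    """Answer a (head, relation) subquestion by exact lookup in a toy KG
    given as a set of (head, relation, tail) triples. KG hits get full
    confidence, but the KG may be incomplete and return nothing."""
    head, rel = subquestion
    return {t: 1.0 for (h, r, t) in kg if (h, r) == (head, rel)}

def llm_answers(subquestion, llm_table):
    """Simulate an LLM answering the same subquestion. We assign a lower
    confidence because LLM outputs may be hallucinated."""
    return {ans: 0.6 for ans in llm_table.get(subquestion, [])}

def merge(kg_scores, llm_scores):
    """Aggregate candidates from both sources, keeping the higher score
    when the same entity is proposed by both."""
    merged = dict(kg_scores)
    for ans, score in llm_scores.items():
        merged[ans] = max(merged.get(ans, 0.0), score)
    return merged

def lgot_step(subquestion, kg, llm_table, top_k=2):
    """One step: answer a subquestion with both sources, then keep the
    highest-quality candidates to feed into the next subquestion."""
    scores = merge(kg_answers(subquestion, kg), llm_answers(subquestion, llm_table))
    return sorted(scores, key=scores.get, reverse=True)[:top_k]

# Toy two-hop query: "what drug treats the disease caused by GeneX?"
# The KG is missing the first hop (incompleteness), so the LLM fills it in;
# the KG then answers the second hop with high confidence.
KG = {("Alzheimer", "treated_by", "Donepezil")}
LLM = {("GeneX", "causes"): ["Alzheimer"]}

diseases = lgot_step(("GeneX", "causes"), KG, LLM)        # from the LLM
drugs = [d for dis in diseases
         for d in lgot_step((dis, "treated_by"), KG, LLM)]  # from the KG
print(drugs)
```

The point of the sketch is the complementarity claimed in the abstract: the KG alone fails on the first hop (sparse graph), and the LLM alone would answer the second hop with lower confidence, but merging both per step lets the query complete.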