Experiments with text-to-SPARQL based on ChatGPT.

IEEE International Conference on Semantic Computing (2024)

Abstract
Currently, large language models (LLMs) are the state of the art among pre-trained language models. LLMs have been applied to many tasks, including question answering over Knowledge Graphs (KGs) and text-to-SPARQL, that is, the translation of natural language (NL) questions into SPARQL queries. With this motivation, this paper first describes preliminary experiments to evaluate the ability of ChatGPT to answer NL questions over KGs. Based on these experiments, the paper introduces Auto-KGQAGPT, an autonomous, domain-independent framework based on LLMs for text-to-SPARQL. The framework selects fragments of the KG, which the LLM uses to translate the user's NL question into a SPARQL query over the KG. Finally, the paper describes preliminary experiments with Auto-KGQAGPT using ChatGPT, which indicate that the framework substantially reduces the number of tokens passed to ChatGPT without sacrificing performance.
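To make the fragment-based translation idea concrete, below is a minimal sketch of the general text-to-SPARQL pattern the abstract describes: a selected KG fragment and the NL question are packed into a prompt, and the LLM is asked to return a SPARQL query. The fragment, the prompt wording, and the call_llm() stub are illustrative assumptions, not the paper's actual implementation.

```python
def call_llm(prompt: str) -> str:
    """Placeholder for a ChatGPT/LLM call (plug in your own client here)."""
    raise NotImplementedError("Connect this stub to an LLM of your choice.")


def build_prompt(kg_fragment: list[str], question: str) -> str:
    """Combine a KG fragment (as N-Triples) and an NL question into one prompt."""
    triples = "\n".join(kg_fragment)
    return (
        "You are given a fragment of an RDF Knowledge Graph as N-Triples:\n"
        f"{triples}\n\n"
        "Translate the following natural-language question into a SPARQL query "
        "over this graph. Return only the SPARQL query.\n"
        f"Question: {question}"
    )


if __name__ == "__main__":
    # Hypothetical fragment restricted to the entities relevant to the question,
    # mirroring the fragment-selection step that reduces tokens sent to the LLM.
    fragment = [
        "<http://example.org/Brazil> <http://example.org/capital> "
        "<http://example.org/Brasilia> .",
    ]
    question = "What is the capital of Brazil?"
    sparql_query = call_llm(build_prompt(fragment, question))
    print(sparql_query)
```

The key design point reflected here is that only a small, question-relevant slice of the KG is sent to the model, which is how the framework keeps the token count low.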
Keywords
text-to-SPARQL, ChatGPT, LLM, Knowledge Graph