DTS-SQL: Decomposed Text-to-SQL with Small Large Language Models
CoRR (2024)
Abstract
Leading models for the text-to-SQL task heavily rely on proprietary Large
Language Models (LLMs), posing concerns over data privacy. Closing the
performance gap between small open-source models and large proprietary models
is crucial to mitigate this reliance. To this end, we introduce a novel
two-stage fine-tuning approach that decomposes the task into two simpler tasks.
Through comprehensive evaluation on two large cross-domain datasets and two
small LLMs, we show that this approach improves execution accuracy by 3 to 7
percent, effectively aligning the performance of open-source models with their
proprietary counterparts.
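The abstract describes decomposing text-to-SQL into two simpler, separately fine-tuned stages. A minimal sketch of such a two-stage pipeline is shown below; the concrete stage names (schema linking, then SQL generation), the keyword-based linker, and the template-based generator are illustrative stand-ins for the fine-tuned small LLMs, not the paper's actual implementation.

```python
def schema_linking(question: str, schema: dict) -> dict:
    """Stage 1: select the tables/columns relevant to the question.
    A trivial keyword match stands in for a fine-tuned small LLM."""
    q = question.lower()
    return {
        table: cols
        for table, cols in schema.items()
        if table.lower() in q or any(c.lower() in q for c in cols)
    }

def sql_generation(question: str, linked_schema: dict) -> str:
    """Stage 2: generate SQL conditioned only on the linked sub-schema.
    A fine-tuned model would produce this; a template is used for illustration."""
    table = next(iter(linked_schema))
    return f"SELECT * FROM {table};"

def text_to_sql(question: str, schema: dict) -> str:
    """Run the two stages in sequence: link the schema, then generate SQL."""
    linked = schema_linking(question, schema)
    return sql_generation(question, linked)

# Hypothetical toy schema for demonstration.
schema = {"singers": ["name", "age"], "concerts": ["venue", "date"]}
print(text_to_sql("What is the age of each singer?", schema))
```

The design intuition is that conditioning the second model on a pruned sub-schema, rather than the full database schema, simplifies each sub-task enough for a small open-source model to handle.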