DSPy Assertions: Computational Constraints for Self-Refining Language Model Pipelines
CoRR (2023)
Abstract
Chaining language model (LM) calls as composable modules is fueling a new way
of programming, but ensuring LMs adhere to important constraints requires
heuristic "prompt engineering". We introduce LM Assertions, a programming
construct for expressing computational constraints that LMs should satisfy. We
integrate our constructs into the recent DSPy programming model for LMs, and
present new strategies that allow DSPy to compile programs with LM Assertions
into more reliable and accurate systems. We also propose strategies to use
assertions at inference time for automatic self-refinement with LMs. We report
on four diverse case studies for text generation and find that LM Assertions
improve not only compliance with imposed rules but also downstream task
performance, passing constraints up to 164% more often and generating up to 37%
more higher-quality responses. Our reference implementation of LM Assertions is
integrated into DSPy at https://github.com/stanfordnlp/dspy
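The core idea of inference-time self-refinement with assertions can be illustrated with a small sketch. This is not the DSPy API: the helper `lm_assert`, the toy module, and the feedback string below are all hypothetical stand-ins for an LM call, a computational constraint, and the corrective retry loop the abstract describes.

```python
# Minimal sketch (assumed names, not the DSPy API) of the pattern in which
# a failed assertion triggers a retry of the LM call with corrective feedback.

def lm_assert(module, constraint, feedback, max_retries=2):
    """Run `module`; if `constraint` fails, re-invoke it with feedback."""
    hint = None
    output = module(hint)
    for _ in range(max_retries):
        if constraint(output):
            return output      # constraint satisfied: pass the result through
        hint = feedback        # on failure, retry with corrective feedback
        output = module(hint)
    return output              # soft fallback after exhausting retries

# Toy "module": stands in for an LM that only obeys the length limit
# once it receives feedback.
def toy_module(hint):
    return "short answer" if hint else "a very long rambling answer indeed"

result = lm_assert(
    toy_module,
    constraint=lambda text: len(text) <= 20,  # the computational constraint
    feedback="Please answer in 20 characters or fewer.",
)
print(result)
```

Here the first call violates the 20-character constraint, so the module is re-run with the feedback hint and the second output passes.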