Towards Grounded Natural Language Proof Generation

semanticscholar (2021)

Abstract
When a student is working on a mathematical proof, it is often helpful to receive suggestions about how to proceed. To this end, we provide an initial study of two generation tasks in natural mathematical language: suggesting the next step in a proof, and full-proof generation. As proofs are grounded in past results (e.g., theorems and definitions), we study knowledge-grounded generation methods, and find that conditioning on retrieved or ground-truth knowledge greatly improves generations. We characterize error types and provide directions for future research.