Distilling Step-by-Step! Outperforming Larger Language Models with Less Training Data and Smaller Model Sizes.

Annual Meeting of the Association for Computational Linguistics (2023)

Cited 509 | Viewed 1125

Keywords
Language Modeling,Natural Language Processing,Topic Modeling,Lexical Simplification,Syntax-based Translation Models