DistillSpec: Improving Speculative Decoding via Knowledge Distillation

Yongchao Zhou, Kaifeng Lyu, Ankit Singh Rawat, Aditya Krishna Menon, Afshin Rostamizadeh, Sanjiv Kumar, Jean-François Kagy, Rishabh Agarwal

ICLR 2024 | Cited by 79

Keywords: large language model, knowledge distillation, speculative decoding
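To make the keywords concrete: speculative decoding has a cheap draft model propose a block of tokens that the larger target model then verifies, and distilling the draft toward the target raises the acceptance rate. Below is a minimal, illustrative sketch of one speculative step with greedy verification; it is a toy under assumed interfaces (`draft_next`/`target_next` as next-token functions), not the DistillSpec implementation.

```python
def speculative_step(draft_next, target_next, context, block_size):
    """One speculative decoding step with greedy verification.

    draft_next / target_next: functions mapping a token sequence to the
    next token (greedy, so verification is an exact-match check).
    Returns the list of tokens committed this step.
    """
    # Draft model autoregressively proposes `block_size` tokens.
    proposed = []
    ctx = list(context)
    for _ in range(block_size):
        t = draft_next(ctx)
        proposed.append(t)
        ctx.append(t)

    # Target model verifies the proposals in order; a token is accepted
    # iff it matches the target's own greedy choice at that position.
    committed = []
    ctx = list(context)
    for t in proposed:
        expected = target_next(ctx)
        if t == expected:
            committed.append(t)
            ctx.append(t)
        else:
            # Reject: commit the target's correction and stop early.
            committed.append(expected)
            break
    else:
        # Every proposal accepted: take one bonus token from the target.
        committed.append(target_next(ctx))
    return committed
```

With toy models where the draft agrees with the target on early positions but diverges later, the step commits the accepted prefix plus the target's correction in a single verification pass; the better the draft is distilled toward the target, the longer that prefix tends to be.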