
DDK: Distilling Domain Knowledge for Efficient Large Language Models

NeurIPS 2024

Cited: 7 | Views: 37
Keywords: Knowledge Distillation, Large Language Models, Model Acceleration