EDT: Improving Large Language Models' Generation by Entropy-based Dynamic Temperature Sampling
arXiv (2024)
Abstract
Recently, Large Language Models (LLMs) have demonstrated outstanding
performance across a wide range of downstream language tasks. Temperature
sampling is a commonly used decoding strategy for LLMs' generation process.
However, a fixed temperature parameter is used in most cases, which may not
always be an optimal choice for balancing generation quality and diversity. In
this paper, we propose an effective Entropy-based Dynamic Temperature (EDT)
Sampling method that achieves a better balance between generation quality and
diversity by dynamically selecting the temperature parameter. We also report
model performance and comprehensive analyses on four generation benchmarks.
Our experiments show that EDT significantly outperforms existing strategies
across different tasks.
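Since the abstract only sketches the idea, a minimal illustration may help. The snippet below computes the entropy of the model's next-token distribution and uses it to modulate the sampling temperature: a confident (low-entropy) distribution is sampled with a lower temperature, while an uncertain (high-entropy) one is sampled near the baseline temperature. The exponential entropy-to-temperature mapping and the constants `t0`, `theta`, and `n` are assumptions chosen for illustration, not necessarily the paper's exact formulation.

```python
import torch
import torch.nn.functional as F

def edt_sample(logits: torch.Tensor,
               t0: float = 1.0,
               theta: float = 1.0,
               n: float = 0.8) -> torch.Tensor:
    """Sample one token with an entropy-dependent temperature.

    logits: unnormalized next-token scores, shape (vocab_size,).
    t0:     baseline temperature.
    theta, n: hypothetical shaping constants for the entropy-to-
              temperature mapping (illustrative, not from the paper).
    """
    # Entropy of the model's next-token distribution at temperature 1.
    probs = F.softmax(logits, dim=-1)
    entropy = -(probs * torch.log(probs.clamp_min(1e-12))).sum()

    # One plausible monotone mapping from entropy to temperature:
    # as entropy -> 0 the exponent grows and T -> 0 (sharper sampling);
    # as entropy grows the exponent -> 0 and T -> t0 (baseline sampling).
    temperature = t0 * n ** (theta / entropy.clamp_min(1e-6))

    # Standard temperature sampling with the dynamic temperature.
    scaled = F.softmax(logits / temperature, dim=-1)
    return torch.multinomial(scaled, num_samples=1)

# Usage with dummy logits standing in for a real model's output.
logits = torch.randn(32_000)
token_id = edt_sample(logits)
```

The key design point this sketch tries to capture is that the temperature is recomputed at every decoding step from the current distribution, rather than fixed once for the whole generation.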