Everything of Thoughts: Defying the Law of Penrose Triangle for Thought Generation
CoRR (2023)
Abstract
Recent advancements in Large Language Models (LLMs) have revolutionized
decision-making by breaking down complex problems into more manageable language
sequences referred to as "thoughts". An effective thought design should
consider three key perspectives: performance, efficiency, and flexibility.
However, existing thought paradigms can exhibit at most two of these attributes. To
address these limitations, we introduce a novel thought prompting approach
called "Everything of Thoughts" (XoT) to defy the law of the "Penrose triangle" of
existing thought paradigms. XoT leverages pretrained reinforcement learning and
Monte Carlo Tree Search (MCTS) to incorporate external domain knowledge into
thoughts, thereby enhancing LLMs' capabilities and enabling them to generalize
to unseen problems efficiently. Through the utilization of the MCTS-LLM
collaborative thought revision framework, this approach autonomously produces
high-quality comprehensive cognitive mappings with minimal LLM interactions.
Additionally, XoT empowers LLMs to engage in unconstrained thinking, allowing
for flexible cognitive mappings for problems with multiple solutions. We
evaluate XoT on several challenging multi-solution problem-solving tasks,
including Game of 24, 8-Puzzle, and Pocket Cube. Our results demonstrate that
XoT significantly outperforms existing approaches. Notably, XoT can yield
multiple solutions with just one LLM call, showcasing its remarkable
proficiency in addressing complex problems across diverse domains.
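The abstract describes thoughts being generated by Monte Carlo Tree Search guided by pretrained policy/value networks. The sketch below is a minimal, generic MCTS loop over "thought" states, not the paper's implementation: the environment interface (get_actions, apply_action, is_solved), the toy demo, and all parameters are illustrative assumptions, and a random rollout stands in for the pretrained value network used in XoT.

```python
import math
import random

class Node:
    """A search-tree node holding a partial 'thought' state."""
    def __init__(self, state, parent=None, action=None):
        self.state = state          # partial solution / thought state
        self.parent = parent
        self.action = action        # action that produced this state
        self.children = []
        self.visits = 0
        self.value = 0.0

    def ucb(self, c=1.4):
        # Upper-confidence bound balancing exploration and exploitation.
        if self.visits == 0:
            return float("inf")
        return self.value / self.visits + c * math.sqrt(
            math.log(self.parent.visits) / self.visits
        )

def mcts(root_state, get_actions, apply_action, is_solved, n_sim=500, depth=10):
    """Run MCTS and return the most-visited action sequence as a candidate thought."""
    root = Node(root_state)
    for _ in range(n_sim):
        # 1) Selection: descend by UCB until reaching a leaf.
        node = root
        while node.children:
            node = max(node.children, key=lambda n: n.ucb())
        # 2) Expansion: add a child for each legal action at the leaf.
        if not is_solved(node.state):
            for a in get_actions(node.state):
                node.children.append(Node(apply_action(node.state, a), node, a))
            if node.children:
                node = random.choice(node.children)
        # 3) Simulation: random rollout (XoT would query a pretrained value network here).
        state, reward = node.state, 0.0
        for _ in range(depth):
            if is_solved(state):
                reward = 1.0
                break
            actions = get_actions(state)
            if not actions:
                break
            state = apply_action(state, random.choice(actions))
        # 4) Backpropagation: propagate the rollout reward up to the root.
        while node is not None:
            node.visits += 1
            node.value += reward
            node = node.parent
    # Extract the most-visited path; in XoT this candidate thought would be
    # handed to the LLM for revision rather than used directly.
    path, node = [], root
    while node.children:
        node = max(node.children, key=lambda n: n.visits)
        path.append(node.action)
    return path

if __name__ == "__main__":
    # Toy demo (not the paper's Game of 24): reach 24 by adding 3, 5, or 7 to 0.
    steps = [3, 5, 7]
    path = mcts(
        root_state=0,
        get_actions=lambda s: [a for a in steps if s + a <= 24],
        apply_action=lambda s, a: s + a,
        is_solved=lambda s: s == 24,
    )
    print("candidate thought (action sequence):", path)
```

In the actual framework, the rollout and the expansion prior would come from policy/value networks pretrained on the task, and the resulting thought trajectory would be passed to the LLM for collaborative revision with only a small number of LLM calls.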
Keywords
Penrose triangle, thought generation, thoughts, law