ProSwitch: Knowledge-Guided Instruction Tuning to Generate Professional and Non-Professional Styled Text
arXiv (2024)
Abstract
Large Language Models (LLMs) have demonstrated efficacy in various linguistic
applications, including text summarization and controlled text generation.
However, their capacity to switch between styles via fine-tuning remains
underexplored. This study focuses on textual professionalism and introduces a
novel method, named ProSwitch, which equips a language model to produce both
professional and non-professional responses through knowledge-guided
instruction tuning. ProSwitch unfolds across three phases: data preparation,
which gathers domain knowledge and the training corpus; instruction tuning,
which optimizes language models with multiple levels of instruction formats;
and comprehensive evaluation, which assesses both how reliably the
professionalism of generated text can be discriminated and its reference-based
quality. Comparative analysis against both general and specialized language
models shows that ProSwitch outperforms these baselines in switching between
professional and non-professional text generation.
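
To make the instruction-tuning phase concrete, below is a minimal Python sketch of what a knowledge-guided, style-switching instruction format might look like. The template wording, field names, and the build_instruction helper are illustrative assumptions, not the paper's actual prompt levels, which are defined in the full text.

from typing import Optional

# Hypothetical style directives; ProSwitch's real instruction levels
# and wording are assumptions here, not taken from the paper.
STYLE_TAGS = {
    "professional": "Answer as a domain expert, using precise terminology.",
    "non_professional": "Answer in plain language for a general audience.",
}

def build_instruction(question: str, style: str,
                      knowledge: Optional[str] = None) -> str:
    """Compose one instruction-tuning prompt.

    `knowledge` holds retrieved domain text (the knowledge-guided part);
    `style` selects the professional or non-professional directive.
    """
    parts = [STYLE_TAGS[style]]
    if knowledge:  # knowledge-guided variant: prepend supporting domain facts
        parts.append("Reference knowledge:\n" + knowledge)
    parts.append("Question: " + question)
    return "\n\n".join(parts)

# Example: the same question rendered in both styles for the training corpus.
q = "What causes type 2 diabetes?"
for s in STYLE_TAGS:
    print(build_instruction(q, s,
          knowledge="Insulin resistance reduces glucose uptake."))
    print("-" * 40)

Pairing each question with both style directives, optionally grounded by retrieved knowledge, is one plausible way to realize the "multiple levels of instruction formats" the abstract describes.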