A Transformer-BERT Integrated Model-Based Automatic Conversation Method Under English Context

Xing'an Li, Tangfa Liu, Longlong Zhang, Fayez Alqahtani, Amr Tolba

IEEE Access (2024)

Abstract
Contextual understanding in complex conversation scenarios remains a challenging problem, and most existing methods lack this capability. To bridge this gap, this paper formulates a novel composite large language model to investigate the issue. Taking the English context as the scenario, a Transformer-BERT integrated automatic conversation model is proposed in this work. First, the unidirectional BERT-based automatic conversation model is improved by introducing an attention mechanism, which is expected to enhance feature expression for conversation texts by linking context to identify long and difficult sentences. In addition, a bidirectional Transformer encoder is placed as an input layer before the BERT encoder. Through these two modules, dynamic language training based on English situational conversations can be completed to build the automatic conversation model. The proposed model is further assessed, in terms of conversation performance, on a large corpus of real-world English conversations. The experimental results show that, compared with traditional rule-based or machine learning methods, the proposed model significantly improves response quality and fluency in the English context: it more accurately understands context, captures subtle semantic differences, and generates more coherent responses.
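The abstract describes stacking a bidirectional Transformer encoder in front of a BERT encoder, with both stages built on attention. The paper itself gives no code; the following is a minimal NumPy sketch of the shared primitive, scaled dot-product self-attention, applied twice to mimic the two-stage pipeline. All weights, dimensions, and the two-call composition are illustrative assumptions, not the authors' actual configuration.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax over the given axis
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    # scaled dot-product self-attention over a token sequence X (seq_len, d)
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])  # pairwise token affinities
    return softmax(scores) @ V               # context-mixed representations

rng = np.random.default_rng(0)
d = 8
X = rng.normal(size=(5, d))  # 5 tokens, hidden size 8 (toy values)

# Stage 1: "Transformer encoder" input layer (hypothetical random weights)
W1 = [rng.normal(size=(d, d)) * 0.1 for _ in range(3)]
H = self_attention(X, *W1)

# Stage 2: "BERT encoder" reuses the same attention primitive on stage-1 output
W2 = [rng.normal(size=(d, d)) * 0.1 for _ in range(3)]
out = self_attention(H, *W2)
print(out.shape)  # (5, 8)
```

The key point the sketch illustrates is that both stages preserve the sequence shape, so the Transformer layer can be slotted in front of BERT as a drop-in input encoder.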
Keywords
Large language model, automatic conversation, semantic context, natural language processing