SeTransformer: A Transformer-Based Code Semantic Parser for Code Comment Generation

IEEE Transactions on Reliability (2023)

Abstract
Automated code comment generation technologies help developers understand code intent, which can significantly reduce the cost of software maintenance and revision. Recent studies in this field mainly rely on deep neural networks, such as convolutional neural networks and recurrent neural networks. However, these methods may fail to generate high-quality, readable code comments because of the long-term dependence problem: the code blocks that carry the information to be summarized are far apart in the input sequence, so these methods forget feature information from earlier inputs during training. In this article, to solve the long-term dependence problem and to extract both textual and structural information from program code, we propose a novel improved-Transformer-based comment generation method named SeTransformer. Specifically, SeTransformer takes the code tokens and the abstract syntax tree (AST) of a program as inputs, and then leverages the self-attention mechanism to analyze the textual and structural features of code simultaneously. Experimental results on a public corpus gathered from large-scale open-source projects show that our method significantly outperforms five state-of-the-art baselines (such as Hybrid-DeepCom and AST-attendgru). Furthermore, we conducted a questionnaire survey of developers, and the results show that SeTransformer can generate higher-quality comments than the baselines.
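The self-attention mechanism the abstract refers to lets every token attend directly to every other token, which avoids the long-term dependence problem of recurrent models. The following is a minimal, generic sketch of scaled dot-product self-attention in NumPy; it is an illustration of the technique, not the SeTransformer implementation, and all names and dimensions are assumptions.

```python
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention over a sequence of embeddings X.

    X: (n_tokens, d_model) token embeddings (e.g., code tokens or AST nodes).
    Wq, Wk, Wv: (d_model, d_k) projection matrices for queries, keys, values.
    """
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                # pairwise token affinities
    scores -= scores.max(axis=-1, keepdims=True)   # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True) # row-wise softmax
    return weights @ V                             # context-aware representations

# Toy example: 4 "tokens" with embedding dimension 8 (illustrative sizes)
rng = np.random.default_rng(0)
X = rng.standard_normal((4, 8))
Wq, Wk, Wv = (rng.standard_normal((8, 8)) for _ in range(3))
out = self_attention(X, Wq, Wk, Wv)
print(out.shape)  # (4, 8): one attended vector per input token
```

Because each output row is a weighted mixture over all input positions, distant-but-related code blocks contribute to each other's representations in a single step, regardless of how far apart they are in the source file.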
Keywords
Code comment generation, convolutional neural network (CNN), deep learning, program comprehension, Transformer