MGCoT: Multi-Grained Contextual Transformer for table-based text generation

Expert Systems with Applications (2024)

Abstract
Recent advances in Transformers have driven a revolution in table-based text generation. However, most existing Transformer-based architectures ignore the rich contexts among input tokens distributed across multi-level units (e.g., cell, row, or column), sometimes leading to unfaithful generated text that fails to establish accurate association relationships and misses vital information. In this paper, we propose the Multi-Grained Contextual Transformer (MGCoT), a novel architecture that fully capitalizes on the multi-grained contexts among input tokens and thus strengthens the capacity for table-based text generation. The key primitive, the Multi-Grained Contexts (MGCo) module, comprises two components: a local context sub-module that adaptively gathers neighboring tokens to form token-wise local context features, and a global context sub-module that consistently aggregates tokens from a broader range to form a shared global context feature. The former models the short-range dependencies that reflect the salience of tokens within the same fine-grained unit (e.g., a cell or row) attending to the query token, while the latter captures the long-range dependencies that reflect the significance of each token within the same coarse-grained unit (e.g., multiple rows or columns). Based on the fused multi-grained contexts, MGCoT can flexibly and holistically model the content of a table across multi-level structures. On three benchmark datasets, ToTTo, FeTaQA, and Tablesum, MGCoT outperforms strong baselines by a large margin on the quality of the generated texts, demonstrating the effectiveness of multi-grained context modeling. Our source code is available at https://anonymous.4open.science/r/MGCoT-3BED.
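To make the two-branch design concrete, the following is a minimal NumPy sketch of the idea described in the abstract: a local branch where each token attends only to neighbors within a small window (short-range, fine-grained context), a global branch that attention-pools all tokens into one shared vector (long-range, coarse-grained context), and a residual fusion of both with the original token features. The function name `mgco_sketch`, the windowing scheme, and the pooling weights are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def mgco_sketch(X, window=2):
    """Illustrative sketch of a multi-grained context module.

    X: (n, d) array of token features.
    Returns (n, d) features fused with local and global contexts.
    All design details here are simplified assumptions.
    """
    n, d = X.shape

    # Local branch: each query token aggregates neighbors within
    # `window` positions (short-range / fine-grained dependencies).
    local = np.zeros_like(X)
    for i in range(n):
        lo, hi = max(0, i - window), min(n, i + window + 1)
        scores = X[lo:hi] @ X[i] / np.sqrt(d)  # token i as query
        local[i] = softmax(scores) @ X[lo:hi]

    # Global branch: one shared context vector pooled over all
    # tokens (long-range / coarse-grained dependencies).
    pool_w = softmax(X.mean(axis=1) / np.sqrt(d))  # (n,) weights
    global_ctx = pool_w @ X                        # (d,) shared vector

    # Fuse multi-grained contexts with the original features.
    return X + local + global_ctx

rng = np.random.default_rng(0)
tokens = rng.normal(size=(6, 4))
fused = mgco_sketch(tokens)  # shape (6, 4), same as the input
```

Note the asymmetry the abstract emphasizes: the local context is computed per token (token-wise), while the global context is a single vector broadcast to every token before fusion.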
Keywords
Multi-grained contexts, Transformer, Abstractive table question answering, Table-to-text generation