Leveraging Bilinear Attention to Improve Spoken Language Understanding

IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP)(2022)

Abstract
A spoken language understanding (SLU) system typically includes two tasks: intent detection (ID) and slot filling (SF). Optimizing these two tasks in an interactive way with an attention mechanism has been shown to be effective. However, previous attention-based works used only first-order attention designs, which limits their efficacy. To trigger richer information interaction between the input intent and slot features, we propose a novel framework with bilinear attention, which builds second-order feature interactions. By stacking multiple bilinear attention modules equipped with the Exponential Linear Unit (ELU) activation, the framework can build higher- and even infinite-order feature interactions. To demonstrate the effectiveness of the proposed framework, we conduct experiments on two benchmark datasets, SNIPS and ATIS. The experimental results show that our framework is more competitive than multiple baselines as well as the first-order attention model.
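The core idea of bilinear attention (in contrast to first-order additive or dot-product attention) can be sketched as follows. This is a minimal NumPy illustration, not the paper's exact formulation: the function and variable names (`bilinear_attention`, `intent`, `slots`, `W`) are hypothetical, and the paper's stacked modules and ELU activation are omitted for brevity.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def bilinear_attention(intent, slots, W):
    """Second-order attention between intent and slot features (sketch).

    Each score s_ij = intent_i^T W slots_j is bilinear in the two inputs,
    i.e. a second-order feature interaction, whereas first-order attention
    scores each pair with an additive or plain dot-product form.
    """
    scores = intent @ W @ slots.T          # (n_intent, n_slots) bilinear scores
    weights = softmax(scores, axis=-1)     # attend over slot positions
    return weights @ slots                 # intent-conditioned slot context

rng = np.random.default_rng(0)
intent = rng.standard_normal((1, 8))   # one utterance-level intent vector
slots = rng.standard_normal((5, 8))    # five token-level slot features
W = rng.standard_normal((8, 8))        # learnable bilinear weight matrix
ctx = bilinear_attention(intent, slots, W)
print(ctx.shape)                       # (1, 8)
```

Stacking several such modules, as the abstract describes, composes these second-order interactions into higher-order ones.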
Keywords
Spoken Language Understanding, Bilinear Attention, Feature Interaction, Multitask Learning