BaGFN: Broad Attentive Graph Fusion Network for High-Order Feature Interactions

IEEE Transactions on Neural Networks and Learning Systems (2023)

Cited by: 9 | Views: 24
Abstract
Modeling feature interactions is crucial for high-quality feature engineering on multi-field sparse data. At present, a series of state-of-the-art methods extract cross features in a rather implicit, bitwise fashion and lack the comprehensiveness and flexibility needed to learn sophisticated interactions among different feature fields. In this article, we propose a new broad attentive graph fusion network (BaGFN) to better model high-order feature interactions in a flexible and explicit manner. On the one hand, we design an attentive graph fusion module to strengthen high-order feature representations under a graph structure. This graph-based module develops a new bilinear-cross aggregation function to aggregate graph node information, employs the self-attention mechanism to learn the influence of neighboring nodes, and updates the high-order feature representations through multi-hop fusion steps. On the other hand, we further construct a broad attentive cross module to refine high-order feature interactions at the bitwise level. This module designs a new broad attention mechanism to dynamically learn the importance weights of cross features and efficiently conducts sophisticated high-order feature interactions at the granularity of feature dimensions. Experimental results demonstrate the effectiveness of the proposed model.
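To make the graph-fusion idea concrete, the following is a minimal sketch of one fusion hop over a fully connected graph of feature fields: each node sends a bilinear-cross message to its neighbors, self-attention scores weight those messages, and the node representation is updated. This is an illustrative reconstruction from the abstract only, not the paper's actual implementation; the function names, the softmax attention form, and the parameter shapes (`W`, `a`) are assumptions.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector
    e = np.exp(x - x.max())
    return e / e.sum()

def attentive_bilinear_fusion(nodes, W, a):
    """One hypothetical fusion hop over a fully connected field graph.

    nodes: (n, d) array of field embeddings (one row per feature field)
    W:     (d, d) bilinear weight matrix (assumed form of the bilinear-cross)
    a:     (d,)   attention projection vector (assumed scoring function)
    Returns the updated (n, d) node representations; stacking several calls
    would correspond to the multi-hop fusion described in the abstract.
    """
    n, d = nodes.shape
    updated = np.empty_like(nodes)
    for i in range(n):
        # Bilinear-cross message from every neighbor j to node i:
        # (x_i W) ⊙ x_j, an elementwise product after a bilinear transform
        msgs = np.array([(nodes[i] @ W) * nodes[j] for j in range(n)])  # (n, d)
        # Self-attention weights over the neighborhood messages
        scores = softmax(msgs @ a)          # (n,)
        # Weighted aggregation updates node i's high-order representation
        updated[i] = scores @ msgs          # (d,)
    return updated
```

With identity weights and uniform embeddings the attention collapses to a uniform average, which makes the sketch easy to sanity-check; in practice `W` and `a` would be learned jointly with the rest of the network.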
Keywords
Feature extraction,Frequency modulation,Data models,Predictive models,Learning systems,Aggregates,Transforms,Attention mechanism,broad learning system (BLS),feature interactions,graph neural networks