Mgformer: Multi-group transformer for multivariate time series classification

Jianfeng Wen, Nan Zhang, Xuzhe Lu, Zhongyi Hu, Hui Huang

Engineering Applications of Artificial Intelligence (2024)

Abstract
Multivariate time series classification (MTSC) is a crucial task in data science, providing a foundation for analyzing and predicting complex, multi-dimensional data patterns. However, traditional MTSC methods struggle to handle high-dimensional data effectively and require complex feature engineering. Although deep learning methods have shown excellent performance on high-dimensional data, they have difficulty learning diverse temporal patterns in multivariate time series (MTS) and fail to capture deep channel-wise correlations. To this end, we propose a novel Transformer-based MTSC model named Mgformer, which has two basic structures, i.e., the multi-group Transformer module and the channel attention mask module. The multi-group Transformer module combines temporal patching with multiple groups of Transformers to learn complex and diverse temporal patterns at different scales. The channel attention mask module employs an attention masking strategy to improve the model’s capacity to learn channel-wise correlations and to reduce information loss during training. Experimental results on 27 benchmark MTS datasets show that Mgformer outperforms state-of-the-art MTSC methods.
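The abstract does not specify the exact patching scheme, so the following is only a minimal sketch of the temporal-patching idea the multi-group Transformer module builds on: a multivariate series of shape (channels, timesteps) is split into non-overlapping patches at several scales, one patch set per Transformer group. The function name `multi_scale_patches` and the patch lengths are illustrative assumptions, not the paper's API.

```python
import numpy as np

def multi_scale_patches(series, patch_lengths):
    """Split a multivariate series (channels x timesteps) into
    non-overlapping temporal patches at several scales.

    Returns one array per scale with shape (num_patches, channels,
    patch_len); trailing timesteps that do not fill a whole patch
    are dropped. This is an illustrative sketch, not Mgformer's
    actual implementation.
    """
    channels, timesteps = series.shape
    out = {}
    for plen in patch_lengths:
        n = timesteps // plen              # number of full patches
        trimmed = series[:, : n * plen]    # drop the incomplete tail
        # (channels, n, plen) -> (n, channels, plen)
        out[plen] = trimmed.reshape(channels, n, plen).transpose(1, 0, 2)
    return out

series = np.arange(2 * 12).reshape(2, 12)  # 2 channels, 12 timesteps
patches = multi_scale_patches(series, [3, 4])
print(patches[3].shape)  # (4, 2, 3): four patches of length 3
print(patches[4].shape)  # (3, 2, 4): three patches of length 4
```

Each per-scale patch sequence would then be embedded and fed to its own Transformer group, letting different groups attend over short- and long-range temporal patterns.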
Keywords
Multivariate time series classification, Multi-group transformer, Temporal patching, Attention mask