Densely Knowledge-Aware Network for Multivariate Time Series Classification

IEEE Transactions on Systems, Man, and Cybernetics: Systems (2024)

Abstract
Multivariate time series classification (MTSC) based on deep learning (DL) has attracted increasing research attention. The performance of a DL-based MTSC algorithm depends heavily on the quality of the learned representations, which provide semantic information for downstream tasks such as classification. Hence, a model's representation learning ability is critical to its performance. This article proposes a densely knowledge-aware network (DKN) for MTSC. The DKN's feature extractor, called ResMulti-Trans, consists of a residual multihead convolutional network (ResMulti) and a transformer-based network (Trans). ResMulti has five residual multihead blocks for capturing the local patterns of the data, while Trans has three transformer blocks for extracting its global patterns. In addition, to enable dense mutual supervision between lower- and higher-level semantic information, this article adapts densely dual self-distillation (DDSD) to mine the rich regularizations and relationships hidden in the data. Experimental results show that, compared with 5 state-of-the-art self-distillation variants, the proposed DDSD obtains 13/4/13 in terms of "win"/"tie"/"lose" and gains the lowest AVG_rank score. In particular, compared with pure ResMulti-Trans, DKN achieves 20/1/9 regarding win/tie/lose. Finally, DKN outperforms 18 existing MTSC algorithms on 10 UEA2018 datasets and achieves the lowest AVG_rank score.
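The abstract does not give the DDSD loss in closed form, but the idea of dense mutual supervision between lower- and higher-level representations can be sketched as a pair of classifier heads that are each supervised by the labels (cross-entropy) and by the other head's soft predictions (symmetric KL divergence). The sketch below is a minimal, hypothetical NumPy illustration of that idea, not the paper's actual implementation; the function name, the `alpha` weighting, and the two-head setup are assumptions for illustration only.

```python
import numpy as np

def softmax(z, axis=-1):
    # Numerically stable softmax over the class axis
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def kl_div(p, q, eps=1e-9):
    # KL(p || q), averaged over the batch
    return float(np.mean(np.sum(p * np.log((p + eps) / (q + eps)), axis=-1)))

def dual_self_distillation_loss(shallow_logits, deep_logits, labels, alpha=0.5):
    """Hypothetical dual self-distillation loss (illustration, not the paper's
    exact formulation): each head is trained against the labels and against the
    other head's soft output, so lower- and higher-level semantic features
    regularize each other in both directions."""
    p_s, p_d = softmax(shallow_logits), softmax(deep_logits)
    onehot = np.eye(p_s.shape[-1])[labels]
    # Label supervision for both heads
    ce = (-np.mean(np.sum(onehot * np.log(p_s + 1e-9), axis=-1))
          - np.mean(np.sum(onehot * np.log(p_d + 1e-9), axis=-1)))
    # Dense mutual (dual) supervision: symmetric KL between the two heads
    mutual = kl_div(p_s, p_d) + kl_div(p_d, p_s)
    return ce + alpha * mutual
```

When the two heads agree, the mutual term vanishes and only the label loss remains; disagreement between shallow and deep predictions is penalized, which is the regularization effect the abstract attributes to DDSD.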
Keywords
Data mining,deep learning (DL),knowledge distillation (KD),multivariate time series classification (MTSC),transformer